Feb 20 16:31:31 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 20 16:31:31 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc 
restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc 
restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 
16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 
crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 
16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 16:31:31 crc 
restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc 
restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized
by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as
customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized
by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as
customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 16:31:31 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 
crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc 
restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:31 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc 
restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 16:31:32 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 16:31:32 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 20 16:31:32 crc kubenswrapper[4697]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 16:31:32 crc kubenswrapper[4697]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 20 16:31:32 crc kubenswrapper[4697]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 16:31:32 crc kubenswrapper[4697]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 20 16:31:32 crc kubenswrapper[4697]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 20 16:31:32 crc kubenswrapper[4697]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.635659 4697 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639247 4697 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639270 4697 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639276 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639291 4697 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639297 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639303 4697 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639308 4697 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639314 4697 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639320 4697 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639325 
4697 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639330 4697 feature_gate.go:330] unrecognized feature gate: Example Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639335 4697 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639340 4697 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639345 4697 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639350 4697 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639356 4697 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639361 4697 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639366 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639371 4697 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639377 4697 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639384 4697 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639389 4697 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639394 4697 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639401 4697 feature_gate.go:330] unrecognized feature gate: OVNObservability 
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639406 4697 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639411 4697 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639416 4697 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639420 4697 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639425 4697 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639449 4697 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639454 4697 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639460 4697 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639465 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639471 4697 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639476 4697 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639482 4697 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639487 4697 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639494 4697 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639501 4697 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639506 4697 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639512 4697 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639519 4697 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639525 4697 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639532 4697 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639537 4697 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639543 4697 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639547 4697 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639552 4697 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639557 4697 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639564 4697 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639571 4697 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639577 4697 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639583 4697 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639588 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639595 4697 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639600 4697 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639607 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639612 4697 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639617 4697 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639623 4697 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639628 4697 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639632 4697 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639638 4697 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639643 4697 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639648 4697 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639653 
4697 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639659 4697 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639664 4697 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639668 4697 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639673 4697 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.639678 4697 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639823 4697 flags.go:64] FLAG: --address="0.0.0.0" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639838 4697 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639848 4697 flags.go:64] FLAG: --anonymous-auth="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639855 4697 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639864 4697 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639870 4697 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639879 4697 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639886 4697 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639892 4697 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639898 4697 flags.go:64] FLAG: 
--boot-id-file="/proc/sys/kernel/random/boot_id" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639924 4697 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639931 4697 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639938 4697 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639944 4697 flags.go:64] FLAG: --cgroup-root="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639949 4697 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639956 4697 flags.go:64] FLAG: --client-ca-file="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639962 4697 flags.go:64] FLAG: --cloud-config="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639968 4697 flags.go:64] FLAG: --cloud-provider="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639974 4697 flags.go:64] FLAG: --cluster-dns="[]" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639982 4697 flags.go:64] FLAG: --cluster-domain="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639988 4697 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.639994 4697 flags.go:64] FLAG: --config-dir="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640000 4697 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640006 4697 flags.go:64] FLAG: --container-log-max-files="5" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640014 4697 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640020 4697 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640026 4697 
flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640032 4697 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640037 4697 flags.go:64] FLAG: --contention-profiling="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640043 4697 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640050 4697 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640057 4697 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640063 4697 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640071 4697 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640078 4697 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640084 4697 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640090 4697 flags.go:64] FLAG: --enable-load-reader="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640096 4697 flags.go:64] FLAG: --enable-server="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640101 4697 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640108 4697 flags.go:64] FLAG: --event-burst="100" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640114 4697 flags.go:64] FLAG: --event-qps="50" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640120 4697 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640126 4697 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 
16:31:32.640131 4697 flags.go:64] FLAG: --eviction-hard="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640138 4697 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640144 4697 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640150 4697 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640156 4697 flags.go:64] FLAG: --eviction-soft="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640161 4697 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640169 4697 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640175 4697 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640181 4697 flags.go:64] FLAG: --experimental-mounter-path="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640186 4697 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640193 4697 flags.go:64] FLAG: --fail-swap-on="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640198 4697 flags.go:64] FLAG: --feature-gates="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640206 4697 flags.go:64] FLAG: --file-check-frequency="20s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640211 4697 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640230 4697 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640235 4697 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640242 4697 flags.go:64] FLAG: --healthz-port="10248" Feb 20 16:31:32 crc kubenswrapper[4697]: 
I0220 16:31:32.640248 4697 flags.go:64] FLAG: --help="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640254 4697 flags.go:64] FLAG: --hostname-override="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640260 4697 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640265 4697 flags.go:64] FLAG: --http-check-frequency="20s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640271 4697 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640277 4697 flags.go:64] FLAG: --image-credential-provider-config="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640282 4697 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640288 4697 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640294 4697 flags.go:64] FLAG: --image-service-endpoint="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640300 4697 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640305 4697 flags.go:64] FLAG: --kube-api-burst="100" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640311 4697 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640317 4697 flags.go:64] FLAG: --kube-api-qps="50" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640322 4697 flags.go:64] FLAG: --kube-reserved="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640327 4697 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640333 4697 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640338 4697 flags.go:64] FLAG: --kubelet-cgroups="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 
16:31:32.640344 4697 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640350 4697 flags.go:64] FLAG: --lock-file="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640355 4697 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640361 4697 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640367 4697 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640376 4697 flags.go:64] FLAG: --log-json-split-stream="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640381 4697 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640387 4697 flags.go:64] FLAG: --log-text-split-stream="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640393 4697 flags.go:64] FLAG: --logging-format="text" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640398 4697 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640405 4697 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640410 4697 flags.go:64] FLAG: --manifest-url="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640415 4697 flags.go:64] FLAG: --manifest-url-header="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640423 4697 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640449 4697 flags.go:64] FLAG: --max-open-files="1000000" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640457 4697 flags.go:64] FLAG: --max-pods="110" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640463 4697 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 
16:31:32.640469 4697 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640475 4697 flags.go:64] FLAG: --memory-manager-policy="None" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640481 4697 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640486 4697 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640492 4697 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640502 4697 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640516 4697 flags.go:64] FLAG: --node-status-max-images="50" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640522 4697 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640527 4697 flags.go:64] FLAG: --oom-score-adj="-999" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640534 4697 flags.go:64] FLAG: --pod-cidr="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640540 4697 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640549 4697 flags.go:64] FLAG: --pod-manifest-path="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640555 4697 flags.go:64] FLAG: --pod-max-pids="-1" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640560 4697 flags.go:64] FLAG: --pods-per-core="0" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640567 4697 flags.go:64] FLAG: --port="10250" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640572 4697 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640578 4697 flags.go:64] FLAG: --provider-id="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640583 4697 flags.go:64] FLAG: --qos-reserved="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640589 4697 flags.go:64] FLAG: --read-only-port="10255" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640595 4697 flags.go:64] FLAG: --register-node="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640600 4697 flags.go:64] FLAG: --register-schedulable="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640606 4697 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640617 4697 flags.go:64] FLAG: --registry-burst="10" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640623 4697 flags.go:64] FLAG: --registry-qps="5" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640629 4697 flags.go:64] FLAG: --reserved-cpus="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640635 4697 flags.go:64] FLAG: --reserved-memory="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640643 4697 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640649 4697 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640655 4697 flags.go:64] FLAG: --rotate-certificates="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640661 4697 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640667 4697 flags.go:64] FLAG: --runonce="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640672 4697 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640678 4697 flags.go:64] FLAG: 
--runtime-request-timeout="2m0s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640683 4697 flags.go:64] FLAG: --seccomp-default="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640689 4697 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640694 4697 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640700 4697 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640708 4697 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640714 4697 flags.go:64] FLAG: --storage-driver-password="root" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640720 4697 flags.go:64] FLAG: --storage-driver-secure="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640725 4697 flags.go:64] FLAG: --storage-driver-table="stats" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640731 4697 flags.go:64] FLAG: --storage-driver-user="root" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640736 4697 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640742 4697 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640748 4697 flags.go:64] FLAG: --system-cgroups="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640754 4697 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640764 4697 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640769 4697 flags.go:64] FLAG: --tls-cert-file="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640775 4697 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 
16:31:32.640782 4697 flags.go:64] FLAG: --tls-min-version="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640788 4697 flags.go:64] FLAG: --tls-private-key-file="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640794 4697 flags.go:64] FLAG: --topology-manager-policy="none" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640800 4697 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640806 4697 flags.go:64] FLAG: --topology-manager-scope="container" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640811 4697 flags.go:64] FLAG: --v="2" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640819 4697 flags.go:64] FLAG: --version="false" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640827 4697 flags.go:64] FLAG: --vmodule="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640835 4697 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.640841 4697 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.640971 4697 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.640977 4697 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.640983 4697 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.640989 4697 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.640994 4697 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641000 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 20 16:31:32 crc kubenswrapper[4697]: 
W0220 16:31:32.641005 4697 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641010 4697 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641015 4697 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641020 4697 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641027 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641032 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641037 4697 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641042 4697 feature_gate.go:330] unrecognized feature gate: Example Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641047 4697 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641052 4697 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641056 4697 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641061 4697 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641066 4697 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641072 4697 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641078 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641083 4697 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641088 4697 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641094 4697 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641099 4697 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641106 4697 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641111 4697 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641115 4697 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641122 4697 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641128 4697 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641134 4697 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641139 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641144 4697 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641149 4697 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641154 4697 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641159 4697 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641164 4697 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641170 4697 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641175 4697 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641181 4697 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641186 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641191 4697 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641198 4697 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 
16:31:32.641203 4697 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641208 4697 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641213 4697 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641219 4697 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641223 4697 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641228 4697 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641233 4697 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641238 4697 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641243 4697 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641248 4697 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641253 4697 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641258 4697 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641262 4697 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641267 4697 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641272 4697 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641278 4697 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641283 4697 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641288 4697 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641293 4697 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641298 4697 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641304 4697 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641310 4697 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641316 4697 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641323 4697 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641329 4697 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641334 4697 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641340 4697 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.641346 4697 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.642093 4697 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.654072 4697 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.654119 4697 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654207 4697 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654216 4697 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654222 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654228 4697 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654233 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654238 4697 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654244 4697 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654249 4697 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654255 4697 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654261 4697 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654266 4697 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654272 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654277 4697 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654283 4697 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654288 4697 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654292 4697 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654297 4697 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654302 4697 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654307 4697 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654311 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654316 4697 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654321 4697 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654327 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654332 4697 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654337 4697 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654342 4697 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654347 4697 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654352 4697 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654357 4697 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654362 4697 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654367 4697 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654372 4697 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654377 4697 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654382 4697 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654387 4697 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654392 4697 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654397 4697 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654404 4697 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654411 4697 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654419 4697 feature_gate.go:330] unrecognized feature gate: Example
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654425 4697 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654457 4697 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654465 4697 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654471 4697 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654476 4697 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654481 4697 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654486 4697 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654491 4697 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654496 4697 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654501 4697 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654506 4697 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654510 4697 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654517 4697 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654523 4697 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654528 4697 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654533 4697 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654538 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654542 4697 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654548 4697 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654553 4697 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654558 4697 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654563 4697 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654569 4697 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654573 4697 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654580 4697 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654586 4697 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654591 4697 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654597 4697 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654603 4697 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654608 4697 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654614 4697 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.654624 4697 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654762 4697 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654771 4697 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654776 4697 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654781 4697 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654786 4697 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654791 4697 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654796 4697 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654801 4697 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654806 4697 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654811 4697 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654815 4697 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654820 4697 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654825 4697 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654830 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654835 4697 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654839 4697 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654845 4697 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654851 4697 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654858 4697 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654864 4697 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654869 4697 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654875 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654888 4697 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654894 4697 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654899 4697 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654904 4697 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654909 4697 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654913 4697 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654918 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654923 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654928 4697 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654933 4697 feature_gate.go:330] unrecognized feature gate: Example
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654938 4697 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654943 4697 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654948 4697 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654953 4697 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654958 4697 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654962 4697 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654969 4697 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654975 4697 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654980 4697 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654985 4697 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654991 4697 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.654996 4697 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655003 4697 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655010 4697 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655016 4697 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655022 4697 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655028 4697 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655033 4697 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655038 4697 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655043 4697 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655048 4697 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655053 4697 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655058 4697 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655063 4697 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655068 4697 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655073 4697 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655078 4697 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655083 4697 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655087 4697 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655093 4697 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655097 4697 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655102 4697 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655108 4697 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655113 4697 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655118 4697 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655123 4697 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655128 4697 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655133 4697 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.655137 4697 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.655146 4697 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.655368 4697 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.663648 4697 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.663873 4697 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.667664 4697 server.go:997] "Starting client certificate rotation"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.667715 4697 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.668886 4697 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-17 20:00:01.833814722 +0000 UTC
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.669088 4697 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.692847 4697 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.696799 4697 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 20 16:31:32 crc kubenswrapper[4697]: E0220 16:31:32.697087 4697 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.713784 4697 log.go:25] "Validated CRI v1 runtime API"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.752404 4697 log.go:25] "Validated CRI v1 image API"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.754302 4697 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.759133 4697 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-20-16-26-14-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.759173 4697 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.786484 4697 manager.go:217] Machine: {Timestamp:2026-02-20 16:31:32.783120847 +0000 UTC m=+0.563166325 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:be1cf996-28db-4a15-bd39-36b5992b7e01 BootID:aff1fb75-2d23-4538-af06-66acb56ad245 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:3e:92:42 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:3e:92:42 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:3f:e1:a1 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:70:7b:eb Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:84:9f:7c Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:32:76:df Speed:-1 Mtu:1496} {Name:eth10 MacAddress:b6:00:9f:fb:1b:cd Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3a:99:50:ac:23:41 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.787102 4697 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.787482 4697 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.787936 4697 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.788217 4697 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.788324 4697 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.789887 4697 topology_manager.go:138] "Creating topology manager with none policy"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.789921 4697 container_manager_linux.go:303] "Creating device plugin manager"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.790602 4697 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.790662 4697 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.791013 4697 state_mem.go:36] "Initialized new in-memory state store"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.791180 4697 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.796382 4697 kubelet.go:418] "Attempting to sync node with API server"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.796425 4697 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.796488 4697 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.796513 4697 kubelet.go:324] "Adding apiserver pod source"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.796534 4697 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.801577 4697 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Feb 20 16:31:32 crc kubenswrapper[4697]: E0220 16:31:32.801702 4697 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError"
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.801576 4697 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Feb 20 16:31:32 crc kubenswrapper[4697]: E0220 16:31:32.801765 4697 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.802041 4697 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.803314 4697 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.806027 4697 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.807855 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.807911 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.807935 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.807954 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.807984 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.808002 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.808020 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.808049 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.808070 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.808089 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.808116 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.808136 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.809410 4697 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.810425 4697 server.go:1280] "Started kubelet"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.811559 4697 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.811615 4697 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.811565 4697 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.812281 4697 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 20 16:31:32 crc systemd[1]: Started Kubernetes Kubelet.
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.814179 4697 server.go:460] "Adding debug handlers to kubelet server"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.814808 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.814844 4697 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.815084 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 21:56:57.664764303 +0000 UTC
Feb 20 16:31:32 crc kubenswrapper[4697]: E0220 16:31:32.815227 4697 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.815419 4697 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.816376 4697 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.815475 4697 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.816164 4697 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Feb 20 16:31:32 crc kubenswrapper[4697]: E0220 16:31:32.816667 4697 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError"
Feb 20 16:31:32 crc kubenswrapper[4697]: E0220 16:31:32.817006 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="200ms"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.819954 4697 factory.go:55] Registering systemd factory
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.820001 4697 factory.go:221] Registration of the systemd container factory successfully
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.820367 4697 factory.go:153] Registering CRI-O factory
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.820558 4697 factory.go:221] Registration of the crio container factory successfully
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.820821 4697 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.821018 4697 factory.go:103] Registering Raw factory
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.821206 4697 manager.go:1196] Started watching for new ooms in manager
Feb 20 16:31:32 crc kubenswrapper[4697]: E0220 16:31:32.821530 4697 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189601720f3c0e74 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 16:31:32.810362484 +0000 UTC m=+0.590407932,LastTimestamp:2026-02-20 16:31:32.810362484 +0000 UTC m=+0.590407932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.823783 4697 manager.go:319] Starting recovery of all containers
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.839898 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.840307 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 
16:31:32.840530 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.840726 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.840863 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.840985 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.841106 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.841238 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.841373 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.841605 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.841814 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.841985 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.842155 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.842350 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.842598 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.842788 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.843025 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.843186 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.845613 4697 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.845873 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.846075 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.846239 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.846400 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.846616 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.846812 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.847000 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.847177 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.847462 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.847661 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.847851 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.848062 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.848250 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.848512 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.848717 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.848955 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.849120 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.849251 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.849381 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.849630 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.849806 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.849956 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.850085 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.850205 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.850326 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.850492 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.850637 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.850783 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.850909 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.851035 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.851159 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.851318 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" 
seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.851502 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.851972 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.852244 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.852423 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.852623 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.852806 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 
16:31:32.852962 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.853109 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.853269 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.853512 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.853711 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.853904 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.854148 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.854341 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.854580 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.854800 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.855004 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.855218 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.855465 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.855669 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.855863 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.856024 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.856188 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.856380 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.856605 4697 manager.go:324] Recovery completed
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.856626 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.856992 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.857275 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.857497 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863629 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863679 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863705 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863730 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863753 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863778 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863801 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863823 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863846 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863869 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863894 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863922 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863945 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863974 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.863997 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864022 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864045 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864068 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864091 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864113 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864148 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864181 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864243 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864293 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864354 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864392 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864477 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864542 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864662 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864700 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864740 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864774 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864805 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864835 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864868 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864898 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864929 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864960 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.864992 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865024 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865055 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865086 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865124 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865158 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865193 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865229 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865260 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865292 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865323 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865354 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865385 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865416 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865486 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865519 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865547 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865574 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865606 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865639 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865669 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865701 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865729 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865760 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865790 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865823 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865857 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865887 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.865971 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866032 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866098 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866132 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866165 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866200 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866230 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866274 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866305 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866340 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866383 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866419 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866484 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866516 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866548 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866579 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866610 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866641 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866671 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866713 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866744 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866775 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866807 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866841 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866872 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866904 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866934 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.866971 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867002 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867031 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867063 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867098 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867130 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867196 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867228 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867261 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867293 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867324 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867354 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867387 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867423 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867516 4697 reconstruct.go:130] "Volume is marked as uncertain and added
into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867547 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867587 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867618 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867650 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867680 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867715 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867748 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867779 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867811 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867844 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867874 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867904 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867935 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867966 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.867997 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.868026 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.868055 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.868091 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.868120 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.868151 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.868180 4697 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.868208 4697 reconstruct.go:97] "Volume reconstruction finished" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.868228 4697 reconciler.go:26] "Reconciler: start to sync state" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.871942 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.873361 4697 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.874010 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.874050 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.874061 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.875023 4697 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.875201 4697 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.875220 4697 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.875242 4697 state_mem.go:36] "Initialized new in-memory state store" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.875760 4697 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.875811 4697 kubelet.go:2335] "Starting kubelet main sync loop" Feb 20 16:31:32 crc kubenswrapper[4697]: E0220 16:31:32.875868 4697 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 20 16:31:32 crc kubenswrapper[4697]: W0220 16:31:32.876962 4697 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Feb 20 16:31:32 crc kubenswrapper[4697]: E0220 16:31:32.877009 4697 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.887872 4697 policy_none.go:49] "None policy: Start" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.888788 4697 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.888825 4697 state_mem.go:35] "Initializing new in-memory state store" Feb 20 16:31:32 crc kubenswrapper[4697]: E0220 16:31:32.916992 4697 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.959263 4697 manager.go:334] "Starting Device Plugin manager" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.959363 4697 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.959387 4697 server.go:79] "Starting device plugin registration server" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.960232 4697 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.960265 4697 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.960865 4697 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.961029 4697 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.961044 4697 plugin_manager.go:118] "Starting Kubelet 
Plugin Manager" Feb 20 16:31:32 crc kubenswrapper[4697]: E0220 16:31:32.969284 4697 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.976779 4697 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.976889 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.978169 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.978214 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.978231 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.978400 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.979083 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.979144 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.980132 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.980164 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.980183 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.980324 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.983111 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.983181 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.983206 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.983403 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.984151 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.985024 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.985061 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.985105 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.985542 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.985662 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.985691 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.985700 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.986278 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.986350 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.987387 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.987511 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.987536 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.988281 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.988937 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.989127 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.989527 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.989576 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.990384 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.992584 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.992607 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.992618 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.992755 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.992766 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.992775 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.992909 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.992930 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.995709 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.995727 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:32 crc kubenswrapper[4697]: I0220 16:31:32.995736 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:33 crc kubenswrapper[4697]: E0220 16:31:33.017908 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="400ms" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.060481 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.062119 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.062204 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.062228 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.062283 4697 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 16:31:33 crc kubenswrapper[4697]: E0220 16:31:33.063234 4697 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.071875 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.071943 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.071993 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072028 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072135 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072210 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072309 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072375 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072476 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072538 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072631 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072701 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072741 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072779 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.072817 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 
crc kubenswrapper[4697]: I0220 16:31:33.174309 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174378 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174416 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174487 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174524 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174554 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174596 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174643 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174647 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174731 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174769 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174757 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174808 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174814 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174687 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174879 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174884 4697 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.174856 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175088 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175139 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175186 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175215 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175245 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175313 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175330 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175225 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175411 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175496 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175552 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.175664 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.264093 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.265672 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.265727 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.265744 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.265781 4697 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 16:31:33 crc kubenswrapper[4697]: E0220 16:31:33.266487 4697 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Feb 20 16:31:33 
crc kubenswrapper[4697]: I0220 16:31:33.313089 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.337135 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.347354 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.354222 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: W0220 16:31:33.361697 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ffe5ed182d67bd055b47311d24286dba30d3a50b35090bafeb9979cf05f92a28 WatchSource:0}: Error finding container ffe5ed182d67bd055b47311d24286dba30d3a50b35090bafeb9979cf05f92a28: Status 404 returned error can't find the container with id ffe5ed182d67bd055b47311d24286dba30d3a50b35090bafeb9979cf05f92a28 Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.371192 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 16:31:33 crc kubenswrapper[4697]: W0220 16:31:33.379299 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b9f3920790edb369f0cc8af3dd9fbe96ca781cb7eda757bf327b153ddb8c6f63 WatchSource:0}: Error finding container b9f3920790edb369f0cc8af3dd9fbe96ca781cb7eda757bf327b153ddb8c6f63: Status 404 returned error can't find the container with id b9f3920790edb369f0cc8af3dd9fbe96ca781cb7eda757bf327b153ddb8c6f63 Feb 20 16:31:33 crc kubenswrapper[4697]: W0220 16:31:33.381686 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0a6354075fd7aa1dcc5e2147cb06ea9698744fedc849791ffdb55d5fbc23cab8 WatchSource:0}: Error finding container 0a6354075fd7aa1dcc5e2147cb06ea9698744fedc849791ffdb55d5fbc23cab8: Status 404 returned error can't find the container with id 0a6354075fd7aa1dcc5e2147cb06ea9698744fedc849791ffdb55d5fbc23cab8 Feb 20 16:31:33 crc kubenswrapper[4697]: E0220 16:31:33.419329 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="800ms" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.666647 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.669145 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.669214 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 
20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.669238 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.669288 4697 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 16:31:33 crc kubenswrapper[4697]: E0220 16:31:33.669895 4697 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.812303 4697 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.815467 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 04:21:05.757599292 +0000 UTC Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.883838 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e1127ca9356c81943c55a22ac8d0e72d14ebb7dcd7925554e63b3c2cb0cd575d"} Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.886200 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0a6354075fd7aa1dcc5e2147cb06ea9698744fedc849791ffdb55d5fbc23cab8"} Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.887250 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b9f3920790edb369f0cc8af3dd9fbe96ca781cb7eda757bf327b153ddb8c6f63"} Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.888491 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"40170deb1ae21259c487ae95bcb46284c6773e064a2e0ef2c1504a2bd26567a0"} Feb 20 16:31:33 crc kubenswrapper[4697]: I0220 16:31:33.889265 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ffe5ed182d67bd055b47311d24286dba30d3a50b35090bafeb9979cf05f92a28"} Feb 20 16:31:34 crc kubenswrapper[4697]: W0220 16:31:34.195151 4697 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Feb 20 16:31:34 crc kubenswrapper[4697]: E0220 16:31:34.195359 4697 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Feb 20 16:31:34 crc kubenswrapper[4697]: E0220 16:31:34.220568 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="1.6s" Feb 20 16:31:34 crc kubenswrapper[4697]: W0220 16:31:34.232061 4697 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Feb 20 16:31:34 crc kubenswrapper[4697]: E0220 16:31:34.232135 4697 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Feb 20 16:31:34 crc kubenswrapper[4697]: W0220 16:31:34.338953 4697 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Feb 20 16:31:34 crc kubenswrapper[4697]: E0220 16:31:34.339142 4697 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Feb 20 16:31:34 crc kubenswrapper[4697]: W0220 16:31:34.436189 4697 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Feb 20 16:31:34 crc kubenswrapper[4697]: E0220 16:31:34.436357 4697 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" 
logger="UnhandledError" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.470691 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.473495 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.473544 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.473559 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.473607 4697 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 16:31:34 crc kubenswrapper[4697]: E0220 16:31:34.474132 4697 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.776817 4697 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 20 16:31:34 crc kubenswrapper[4697]: E0220 16:31:34.779131 4697 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.812773 4697 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: 
connect: connection refused Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.815869 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 05:01:06.636303206 +0000 UTC Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.893802 4697 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd" exitCode=0 Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.893936 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd"} Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.894048 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.895487 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.895547 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.895566 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.896003 4697 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5" exitCode=0 Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.896163 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5"} Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.896384 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.898063 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.898123 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.898143 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.902366 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.903223 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1"} Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.903274 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c"} Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.903287 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0"} Feb 20 
16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.903297 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a"} Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.903305 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.903820 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.903856 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.903868 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.904408 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.904481 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.904506 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.907065 4697 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b" exitCode=0 Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.907163 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b"} Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.907394 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.908697 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.908742 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.908781 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.911527 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.911405 4697 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="0c1b2b976d031977f5193be6bb9d09f5648acb271099f04495cfd723e8f6f55f" exitCode=0 Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.911695 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"0c1b2b976d031977f5193be6bb9d09f5648acb271099f04495cfd723e8f6f55f"} Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.912811 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.912842 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:34 crc kubenswrapper[4697]: I0220 16:31:34.912859 4697 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.180824 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.812922 4697 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.816270 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 06:32:29.525422464 +0000 UTC Feb 20 16:31:35 crc kubenswrapper[4697]: E0220 16:31:35.822118 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="3.2s" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.921968 4697 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c" exitCode=0 Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.922077 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c"} Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.922153 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.923561 4697 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.923615 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.923633 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.924750 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"05ef3dc6cc8417e0d55e7e4d1bb011b274fbb1bc1e9d314812205b8366d71c9c"} Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.924796 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.926521 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.926562 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.926584 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.928610 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6b78299504f4a4555d5fa6bd331589ed16effa4428034951de3d8f83ce652780"} Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.928679 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.928663 4697 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1adcc43d1a375bc06218f7d3c94c564132a5f3dd5cde0c7ee1f86883b8100552"} Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.928939 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b9b231279d61bb858c2d662fd41b388d450db0ed9f92b55f968a334a2ce2b50f"} Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.929984 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.930067 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.930098 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.932398 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e"} Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.932478 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.932486 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93"} Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.932520 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7"} Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.932542 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577"} Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.933318 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.933385 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:35 crc kubenswrapper[4697]: I0220 16:31:35.933414 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.075268 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.076490 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.076540 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.076553 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.076589 4697 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 16:31:36 crc kubenswrapper[4697]: E0220 16:31:36.077088 4697 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.085156 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:36 crc kubenswrapper[4697]: W0220 16:31:36.085705 4697 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Feb 20 16:31:36 crc kubenswrapper[4697]: E0220 16:31:36.085802 4697 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.671625 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.677504 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.817480 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 02:45:15.185189148 +0000 UTC Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.938528 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae"} Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.938686 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.940004 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.940088 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.940111 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.940356 4697 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f" exitCode=0 Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.940519 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.940550 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.940751 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f"} Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.940877 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.940903 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.941002 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.941741 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.941794 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.941810 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.943569 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.945828 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.946013 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.946057 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.946096 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.946107 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.946113 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.945960 4697 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:36 crc kubenswrapper[4697]: I0220 16:31:36.946345 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.817602 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 04:11:02.620649507 +0000 UTC Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.947319 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d"} Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.947373 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312"} Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.947389 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc"} Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.947401 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84"} Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.947402 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.947419 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:31:37 
crc kubenswrapper[4697]: I0220 16:31:37.947503 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.948462 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.948490 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.948499 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.948565 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.948587 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.948595 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.963173 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.963307 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.964314 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.964340 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:37 crc kubenswrapper[4697]: I0220 16:31:37.964368 4697 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 16:31:38.517055 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 16:31:38.818319 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 19:24:58.550466014 +0000 UTC Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 16:31:38.958169 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142"} Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 16:31:38.958231 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 16:31:38.958313 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 16:31:38.958348 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 16:31:38.959771 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 16:31:38.959811 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 16:31:38.959829 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 16:31:38.959902 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 
16:31:38.959941 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:38 crc kubenswrapper[4697]: I0220 16:31:38.959958 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.014019 4697 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.117548 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.277525 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.279138 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.279183 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.279197 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.279228 4697 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.819074 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 23:54:46.325105959 +0000 UTC Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.962089 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.969016 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.969091 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:39 crc kubenswrapper[4697]: I0220 16:31:39.969120 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:40 crc kubenswrapper[4697]: I0220 16:31:40.745113 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:40 crc kubenswrapper[4697]: I0220 16:31:40.745334 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:40 crc kubenswrapper[4697]: I0220 16:31:40.746516 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:40 crc kubenswrapper[4697]: I0220 16:31:40.746551 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:40 crc kubenswrapper[4697]: I0220 16:31:40.746559 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:40 crc kubenswrapper[4697]: I0220 16:31:40.820050 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 22:44:04.902354568 +0000 UTC Feb 20 16:31:40 crc kubenswrapper[4697]: I0220 16:31:40.965356 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:40 crc kubenswrapper[4697]: I0220 16:31:40.966733 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:40 crc kubenswrapper[4697]: I0220 16:31:40.966778 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 16:31:40 crc kubenswrapper[4697]: I0220 16:31:40.966797 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:41 crc kubenswrapper[4697]: I0220 16:31:41.163428 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:41 crc kubenswrapper[4697]: I0220 16:31:41.163756 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:31:41 crc kubenswrapper[4697]: I0220 16:31:41.163845 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:41 crc kubenswrapper[4697]: I0220 16:31:41.165702 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:41 crc kubenswrapper[4697]: I0220 16:31:41.165765 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:41 crc kubenswrapper[4697]: I0220 16:31:41.165787 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:41 crc kubenswrapper[4697]: I0220 16:31:41.820651 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 12:03:51.742088922 +0000 UTC Feb 20 16:31:42 crc kubenswrapper[4697]: I0220 16:31:42.539760 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:42 crc kubenswrapper[4697]: I0220 16:31:42.540050 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:42 crc kubenswrapper[4697]: I0220 16:31:42.541737 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:42 crc 
kubenswrapper[4697]: I0220 16:31:42.541817 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:42 crc kubenswrapper[4697]: I0220 16:31:42.541844 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:42 crc kubenswrapper[4697]: I0220 16:31:42.820900 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 07:03:19.115928921 +0000 UTC Feb 20 16:31:42 crc kubenswrapper[4697]: E0220 16:31:42.969677 4697 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 20 16:31:43 crc kubenswrapper[4697]: I0220 16:31:43.283536 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 20 16:31:43 crc kubenswrapper[4697]: I0220 16:31:43.283872 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:43 crc kubenswrapper[4697]: I0220 16:31:43.286266 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:43 crc kubenswrapper[4697]: I0220 16:31:43.286353 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:43 crc kubenswrapper[4697]: I0220 16:31:43.286378 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:43 crc kubenswrapper[4697]: I0220 16:31:43.745984 4697 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= 
Feb 20 16:31:43 crc kubenswrapper[4697]: I0220 16:31:43.746081 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 16:31:43 crc kubenswrapper[4697]: I0220 16:31:43.821918 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 13:45:37.27431639 +0000 UTC Feb 20 16:31:44 crc kubenswrapper[4697]: I0220 16:31:44.822573 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 01:31:39.641482021 +0000 UTC Feb 20 16:31:45 crc kubenswrapper[4697]: I0220 16:31:45.822734 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 22:18:22.135590019 +0000 UTC Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.089730 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.089920 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.091537 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.091594 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.091611 4697 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:46 crc kubenswrapper[4697]: W0220 16:31:46.652980 4697 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.653102 4697 trace.go:236] Trace[2030249631]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 16:31:36.651) (total time: 10001ms): Feb 20 16:31:46 crc kubenswrapper[4697]: Trace[2030249631]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (16:31:46.652) Feb 20 16:31:46 crc kubenswrapper[4697]: Trace[2030249631]: [10.001392792s] [10.001392792s] END Feb 20 16:31:46 crc kubenswrapper[4697]: E0220 16:31:46.653133 4697 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 20 16:31:46 crc kubenswrapper[4697]: W0220 16:31:46.683961 4697 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.684131 4697 trace.go:236] Trace[555574179]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 16:31:36.683) (total time: 10000ms): Feb 20 16:31:46 crc kubenswrapper[4697]: Trace[555574179]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (16:31:46.683) Feb 20 16:31:46 crc kubenswrapper[4697]: Trace[555574179]: [10.000916551s] [10.000916551s] END Feb 20 16:31:46 crc kubenswrapper[4697]: E0220 16:31:46.684173 4697 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.718426 4697 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.718537 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.723488 4697 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.723547 4697 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 20 16:31:46 crc kubenswrapper[4697]: I0220 16:31:46.823626 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 05:57:06.433719505 +0000 UTC Feb 20 16:31:47 crc kubenswrapper[4697]: I0220 16:31:47.824284 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 16:54:55.591920279 +0000 UTC Feb 20 16:31:48 crc kubenswrapper[4697]: I0220 16:31:48.825411 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 22:21:39.146621602 +0000 UTC Feb 20 16:31:49 crc kubenswrapper[4697]: I0220 16:31:49.826161 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 11:47:32.18818237 +0000 UTC Feb 20 16:31:50 crc kubenswrapper[4697]: I0220 16:31:50.827014 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 08:37:36.738017999 +0000 UTC Feb 20 16:31:50 crc kubenswrapper[4697]: I0220 16:31:50.962549 4697 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.172743 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.172988 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:51 crc kubenswrapper[4697]: 
I0220 16:31:51.174912 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.174976 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.174989 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.180509 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.416578 4697 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.695684 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.696516 4697 trace.go:236] Trace[158314507]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 16:31:36.900) (total time: 14795ms): Feb 20 16:31:51 crc kubenswrapper[4697]: Trace[158314507]: ---"Objects listed" error: 14795ms (16:31:51.696) Feb 20 16:31:51 crc kubenswrapper[4697]: Trace[158314507]: [14.795573445s] [14.795573445s] END Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.696540 4697 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.700247 4697 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.703107 4697 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes 
\"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.714038 4697 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.718124 4697 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.778107 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.784158 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.810132 4697 apiserver.go:52] "Watching apiserver" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.813661 4697 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.814048 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.814567 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.814686 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.814741 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.814816 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.814897 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.815091 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.814895 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.815345 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.815541 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.817799 4697 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.822827 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.822989 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.823006 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.823155 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.823269 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.823541 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.823659 4697 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"kube-root-ca.crt" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.824220 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.824231 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.827770 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 03:10:16.189397297 +0000 UTC Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.846184 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.858005 4697 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35538->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.858107 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" 
output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35538->192.168.126.11:17697: read: connection reset by peer" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.860858 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.870545 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.876970 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.892968 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902435 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902538 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902591 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902624 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902654 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902676 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902698 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902725 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902757 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902780 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902806 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902834 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902911 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" 
(UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902936 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902962 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.902984 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903008 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903034 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903059 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903239 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903269 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903315 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903342 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903366 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903393 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903417 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903446 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903492 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903515 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903538 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903561 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903583 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903607 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903632 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903659 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903682 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903712 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903721 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903758 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903806 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903830 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903861 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.903906 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:31:52.403878977 +0000 UTC m=+20.183924405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903889 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903955 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.903999 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904034 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904069 
4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904099 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904130 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904158 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904186 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904216 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904243 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904270 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904304 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904336 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904362 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc 
kubenswrapper[4697]: I0220 16:31:51.904396 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904422 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904473 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904510 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904536 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904562 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904596 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904601 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904663 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904689 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904711 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904734 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904756 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904781 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904805 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904833 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904860 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904887 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904913 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904939 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904962 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.904984 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 16:31:51 crc 
kubenswrapper[4697]: I0220 16:31:51.905008 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905029 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905051 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905075 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905096 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905118 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905141 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905164 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905184 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905204 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905227 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905250 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905276 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905302 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905323 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905344 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905374 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 20 16:31:51 crc 
kubenswrapper[4697]: I0220 16:31:51.905398 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905420 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905448 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909743 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909782 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909813 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") 
pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909839 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909865 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909891 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909919 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909945 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909969 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909997 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910033 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910058 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910081 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910105 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910128 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910152 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910176 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910202 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910227 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910250 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910272 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910295 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910319 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910342 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910367 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 16:31:51 crc kubenswrapper[4697]: 
I0220 16:31:51.910390 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910414 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910443 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910481 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910504 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910526 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910551 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910572 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910596 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910619 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910641 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc 
kubenswrapper[4697]: I0220 16:31:51.910667 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910697 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910720 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910745 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910767 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910789 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910811 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910842 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910864 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910888 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.910911 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 16:31:51 crc kubenswrapper[4697]: 
I0220 16:31:51.910934 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.912371 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.912410 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.912439 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920476 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920729 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905594 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905666 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905662 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.905916 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.921572 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.921885 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920767 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.922371 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.923049 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.906072 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.906097 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.906097 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.906168 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.906390 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.906403 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.906446 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.906617 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.906773 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.906804 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.906809 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.907072 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.923843 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.907178 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.907239 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.907328 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.907372 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.907377 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.907530 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.907675 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.907757 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.907816 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.907876 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.908057 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.908082 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.908188 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.908289 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.908332 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.908427 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.908972 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.908994 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909202 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909349 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909846 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.909918 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.911650 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.912401 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.912513 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.913262 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.913899 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.914641 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.915083 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.915118 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.915161 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.915200 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.915865 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.916787 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.917110 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.917183 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.917511 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.918573 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.919750 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920061 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920136 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920180 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920336 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920360 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920367 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920390 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920588 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.920646 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.924474 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.924454 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.924482 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.924607 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.924103 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.924924 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.924801 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.924916 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.924957 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925140 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925180 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925214 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925244 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925270 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925273 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925297 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925327 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925351 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925377 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925403 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925427 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925478 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925486 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925504 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925532 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925619 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925648 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925664 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925678 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925706 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925732 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925759 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925783 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.925806 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926028 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926308 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926344 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926370 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926394 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926420 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926473 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926498 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926523 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926553 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926579 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926606 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926656 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926686 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926712 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926738 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926766 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926791 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926816 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926843 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926918 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 
16:31:51.926956 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.931336 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.932348 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.932516 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.927024 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.932658 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.932725 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.932776 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.933003 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.933047 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.933089 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.933060 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.933119 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.933156 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.933246 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.933329 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.933385 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.933448 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.933840 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.934226 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.926999 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.935538 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.935610 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.935613 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.936237 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.939819 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.940291 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.940518 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.941376 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.942166 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.942189 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.942379 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.942411 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.942505 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.942540 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.943220 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.943227 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.943688 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.944316 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.944823 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.945007 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.945285 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.945736 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.946016 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.947254 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.947507 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.947506 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.947536 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.947599 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.947913 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.948415 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.949344 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.949560 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.950535 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.950941 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.950986 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.951249 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.951137 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.951580 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.951576 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.951647 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.951810 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.951956 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.952078 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.952549 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.952609 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.953046 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.952442 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.953248 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.953476 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.952020 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.953526 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.953624 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.953679 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.953856 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.954325 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.954341 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.954479 4697 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.954584 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.954852 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.954913 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.955521 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-20 16:31:52.455422722 +0000 UTC m=+20.235468140 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.955903 4697 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.956077 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.956134 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.956212 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.956346 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.956515 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.957067 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.957579 4697 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.957849 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.957982 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.957914 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.958607 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.958776 4697 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.961586 4697 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.961694 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:52.461667335 +0000 UTC m=+20.241712743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.961762 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.961931 4697 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.961967 4697 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.961987 4697 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962002 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962017 4697 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962034 4697 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962052 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962068 4697 reconciler_common.go:293] "Volume detached 
for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962083 4697 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962099 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962116 4697 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962131 4697 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962145 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962160 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962174 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962188 4697 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962204 4697 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962218 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962232 4697 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962249 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962264 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962278 4697 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962292 4697 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962306 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962319 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962341 4697 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962355 4697 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962369 4697 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962384 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962398 4697 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962413 4697 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962426 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962443 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962494 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962509 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962522 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" 
DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962537 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962550 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962566 4697 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962580 4697 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962595 4697 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962609 4697 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962622 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath 
\"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962638 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962651 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962663 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962677 4697 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962690 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962703 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962715 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962729 4697 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962743 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962756 4697 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962772 4697 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962787 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962802 4697 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962814 4697 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962828 4697 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962841 4697 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962854 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962870 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962883 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962897 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962941 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962956 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc 
kubenswrapper[4697]: I0220 16:31:51.962970 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962983 4697 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962995 4697 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963012 4697 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963057 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963070 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963084 4697 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963097 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963109 4697 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963122 4697 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963135 4697 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963147 4697 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963159 4697 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963173 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963186 4697 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 
16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.961933 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962051 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962071 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962407 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962582 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.962750 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963094 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963159 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963185 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963295 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963308 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963440 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963639 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963661 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963830 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963883 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.963951 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.964171 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.964286 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.964851 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.965096 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.965583 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.965946 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.966141 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.966249 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.968333 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.968358 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.968595 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.971787 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.972802 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.975133 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.975224 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.976654 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.977211 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.977286 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.977509 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.977539 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.977557 4697 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.977631 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:52.477608627 +0000 UTC m=+20.257654045 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.982001 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.982066 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.982081 4697 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:51 crc kubenswrapper[4697]: E0220 16:31:51.982166 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:52.482149378 +0000 UTC m=+20.262194796 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:51 crc kubenswrapper[4697]: I0220 16:31:51.982678 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.000034 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.000284 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.000541 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.000862 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.001615 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.006959 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.008011 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.008573 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.011338 4697 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae" exitCode=255 Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.012175 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.012792 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae"} Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.014628 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.019927 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.026604 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.027078 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.041634 4697 scope.go:117] "RemoveContainer" containerID="421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.041712 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.045852 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.050584 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.060936 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064235 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064385 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064267 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064467 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064502 4697 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064569 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064584 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064594 4697 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064607 4697 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064619 
4697 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064629 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064648 4697 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064658 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064667 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064676 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064686 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064694 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064704 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064714 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064722 4697 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064731 4697 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064741 4697 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064751 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064763 4697 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node 
\"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064776 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064787 4697 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064798 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064808 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064818 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064827 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064837 4697 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064847 4697 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064857 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064867 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064877 4697 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064887 4697 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064897 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064907 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064917 4697 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064928 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064938 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064949 4697 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064958 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064967 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064978 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064988 4697 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.064998 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065008 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065019 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065030 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065041 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065051 4697 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065060 4697 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" 
DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065070 4697 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065079 4697 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065092 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065102 4697 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065113 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065125 4697 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065135 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065149 4697 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065161 4697 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065174 4697 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065186 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065197 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065206 4697 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065216 4697 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065226 4697 reconciler_common.go:293] "Volume detached for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065237 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065249 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065281 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065292 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065304 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065327 4697 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065337 4697 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065349 4697 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065359 4697 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065369 4697 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065381 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065390 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065413 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065424 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065458 4697 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065471 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065482 4697 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065494 4697 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065508 4697 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065520 4697 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065531 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065554 4697 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065565 4697 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065578 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065596 4697 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065607 4697 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065619 4697 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065632 4697 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc 
kubenswrapper[4697]: I0220 16:31:52.065644 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065657 4697 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065675 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065687 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065703 4697 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065715 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065727 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065739 4697 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065751 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065764 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065776 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065788 4697 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065800 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065813 4697 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065825 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065837 4697 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.065861 4697 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.075318 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.087395 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.099623 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.109762 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.119390 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.143161 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.157490 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.165054 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 16:31:52 crc kubenswrapper[4697]: W0220 16:31:52.175097 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-a6524ae5d30895a53b03f808b66eb9ceaee5b24876268dbdd5b9b5c4c67629b5 WatchSource:0}: Error finding container a6524ae5d30895a53b03f808b66eb9ceaee5b24876268dbdd5b9b5c4c67629b5: Status 404 returned error can't find the container with id a6524ae5d30895a53b03f808b66eb9ceaee5b24876268dbdd5b9b5c4c67629b5 Feb 20 16:31:52 crc kubenswrapper[4697]: W0220 16:31:52.195556 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-2f33595c1ea9b36421829608bc4a9c83af6b85ddb614cd4423e5b4b447df6687 WatchSource:0}: Error finding container 2f33595c1ea9b36421829608bc4a9c83af6b85ddb614cd4423e5b4b447df6687: Status 404 returned error can't find the container with id 2f33595c1ea9b36421829608bc4a9c83af6b85ddb614cd4423e5b4b447df6687 Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.468685 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.468858 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.468918 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.468981 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:31:53.468943608 +0000 UTC m=+21.248989016 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.469018 4697 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.469096 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:53.469075771 +0000 UTC m=+21.249121189 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.469106 4697 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.469240 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:53.469214214 +0000 UTC m=+21.249259622 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.569799 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.570186 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:31:52 crc 
kubenswrapper[4697]: E0220 16:31:52.570257 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.570283 4697 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.570388 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:53.570356937 +0000 UTC m=+21.350402385 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.570535 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.570581 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.570598 4697 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.570683 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:53.570656554 +0000 UTC m=+21.350701962 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.570887 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.829405 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 14:54:21.938427722 +0000 UTC Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.876213 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:31:52 crc kubenswrapper[4697]: E0220 16:31:52.876381 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.882843 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.883625 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.885200 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.886156 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.887557 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.888208 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.888990 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.890287 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.891173 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.892393 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.893134 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.894599 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.895266 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.895958 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.897182 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.896951 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:52Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.897967 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.899254 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.899872 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.900783 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.902092 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.902702 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.903968 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.904658 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.906011 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.906667 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.907526 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.909194 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.909756 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.910334 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.910890 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.911371 4697 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.911500 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.912921 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.913501 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.914021 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.915389 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.916183 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.916805 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.918778 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:52Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.924983 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.926015 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.926957 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.927676 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 
16:31:52.928938 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.929744 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.930954 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.931652 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.932730 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.933666 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.934765 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.935334 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 
16:31:52.935921 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.937168 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.937863 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.938978 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.964263 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:52Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:52 crc kubenswrapper[4697]: I0220 16:31:52.991254 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:52Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.007725 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.016224 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67"} Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.016278 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c"} Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.016294 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2f33595c1ea9b36421829608bc4a9c83af6b85ddb614cd4423e5b4b447df6687"} Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.018285 4697 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c"} Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.018314 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a6524ae5d30895a53b03f808b66eb9ceaee5b24876268dbdd5b9b5c4c67629b5"} Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.019783 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"cd8058c2eb05b4be564b938fb1c08710183e763526348d40c1efde2309382e48"} Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.022239 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.024135 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91"} Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.024630 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.028139 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.045587 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.061164 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.087916 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.102821 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.123326 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.140779 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.156558 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.169627 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.186263 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.205525 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.313878 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.330094 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.330343 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.331442 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.343826 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d
773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.361219 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.374378 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.395292 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.416396 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.431757 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.448994 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.464187 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.479695 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.481109 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.481302 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.481344 4697 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:31:55.481304207 +0000 UTC m=+23.261349825 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.481473 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.481493 4697 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.481597 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:55.481568904 +0000 UTC m=+23.261614352 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.481677 4697 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.481738 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:55.481726558 +0000 UTC m=+23.261772226 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.499906 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.525406 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.568026 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.582809 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.582891 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.583050 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.583094 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.583109 4697 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.583108 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.583143 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.583166 4697 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.583182 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:55.583160478 +0000 UTC m=+23.363205886 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.583257 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:55.583228729 +0000 UTC m=+23.363274187 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.598196 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.614104 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.627757 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.653824 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.829829 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 01:53:14.180924367 +0000 UTC Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.876594 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:31:53 crc kubenswrapper[4697]: I0220 16:31:53.876648 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.876754 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:31:53 crc kubenswrapper[4697]: E0220 16:31:53.876894 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:31:54 crc kubenswrapper[4697]: I0220 16:31:54.830069 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 17:47:56.820280995 +0000 UTC Feb 20 16:31:54 crc kubenswrapper[4697]: I0220 16:31:54.877208 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:31:54 crc kubenswrapper[4697]: E0220 16:31:54.877500 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:31:55 crc kubenswrapper[4697]: I0220 16:31:55.508841 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.509035 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:31:59.509006802 +0000 UTC m=+27.289052210 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:31:55 crc kubenswrapper[4697]: I0220 16:31:55.509273 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:55 crc kubenswrapper[4697]: I0220 16:31:55.509322 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.509407 4697 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.509503 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:59.509483174 +0000 UTC m=+27.289528582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.509581 4697 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.509613 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:59.509603897 +0000 UTC m=+27.289649385 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:31:55 crc kubenswrapper[4697]: I0220 16:31:55.561985 4697 csr.go:261] certificate signing request csr-p9qvn is approved, waiting to be issued Feb 20 16:31:55 crc kubenswrapper[4697]: I0220 16:31:55.600054 4697 csr.go:257] certificate signing request csr-p9qvn is issued Feb 20 16:31:55 crc kubenswrapper[4697]: I0220 16:31:55.610383 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:31:55 crc kubenswrapper[4697]: I0220 16:31:55.610621 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.610639 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.610796 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:31:55 crc 
kubenswrapper[4697]: E0220 16:31:55.610858 4697 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.610692 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.610984 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.611001 4697 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.611092 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:59.610953154 +0000 UTC m=+27.390998562 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.611194 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 16:31:59.61118232 +0000 UTC m=+27.391227728 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:55 crc kubenswrapper[4697]: I0220 16:31:55.830475 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 08:58:29.009892592 +0000 UTC Feb 20 16:31:55 crc kubenswrapper[4697]: I0220 16:31:55.876857 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:55 crc kubenswrapper[4697]: I0220 16:31:55.876904 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.876977 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:31:55 crc kubenswrapper[4697]: E0220 16:31:55.877322 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.001001 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-nvnfb"] Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.001302 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-nvnfb" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.003666 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.006314 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.020408 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.023727 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.033181 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad"} Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.037127 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.053971 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.069304 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.081578 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.097543 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.109256 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.113940 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e96c5eb-134a-4c03-9899-8f97a9aba0b9-hosts-file\") pod \"node-resolver-nvnfb\" (UID: \"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\") " pod="openshift-dns/node-resolver-nvnfb" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.114024 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6btnr\" (UniqueName: 
\"kubernetes.io/projected/4e96c5eb-134a-4c03-9899-8f97a9aba0b9-kube-api-access-6btnr\") pod \"node-resolver-nvnfb\" (UID: \"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\") " pod="openshift-dns/node-resolver-nvnfb" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.121412 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:
8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.140015 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.153533 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.168237 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.215063 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e96c5eb-134a-4c03-9899-8f97a9aba0b9-hosts-file\") pod \"node-resolver-nvnfb\" (UID: \"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\") " pod="openshift-dns/node-resolver-nvnfb" Feb 
20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.215116 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6btnr\" (UniqueName: \"kubernetes.io/projected/4e96c5eb-134a-4c03-9899-8f97a9aba0b9-kube-api-access-6btnr\") pod \"node-resolver-nvnfb\" (UID: \"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\") " pod="openshift-dns/node-resolver-nvnfb" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.215353 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e96c5eb-134a-4c03-9899-8f97a9aba0b9-hosts-file\") pod \"node-resolver-nvnfb\" (UID: \"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\") " pod="openshift-dns/node-resolver-nvnfb" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.218515 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.241921 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.241987 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6btnr\" (UniqueName: \"kubernetes.io/projected/4e96c5eb-134a-4c03-9899-8f97a9aba0b9-kube-api-access-6btnr\") pod \"node-resolver-nvnfb\" (UID: \"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\") " pod="openshift-dns/node-resolver-nvnfb" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.256123 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.286535 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.310172 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.320418 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nvnfb" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.342668 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.358523 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.373232 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.397307 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-lrpxf"] Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.397764 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.397939 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-vtsdj"] Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.398655 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-bgvrc"] Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.399275 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9zpdc"] Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.399489 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.399539 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.399895 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.406065 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.406184 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.406520 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.406630 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.409889 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.410189 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.410303 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.410649 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.411307 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.411366 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.412457 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.416475 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.416791 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.416859 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.416974 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.417243 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.416829 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.417579 4697 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.417737 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.429931 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.465916 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.486656 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.501361 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519257 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-netd\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519292 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519325 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-env-overrides\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519346 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-hostroot\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519365 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-cnibin\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519492 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br8r5\" (UniqueName: \"kubernetes.io/projected/f571603b-6223-4f16-b5fa-019ef7c4abb6-kube-api-access-br8r5\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519596 4697 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-956hf\" (UniqueName: \"kubernetes.io/projected/1de5dc4e-ef42-48fc-be23-eaec2039c031-kube-api-access-956hf\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519619 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ba970a98-5bee-40d6-ade6-6dcbed87b581-rootfs\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519650 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-var-lib-openvswitch\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519672 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99eb233c-7094-4a86-ab37-0b160001bbef-ovn-node-metrics-cert\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519688 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-script-lib\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 
crc kubenswrapper[4697]: I0220 16:31:56.519708 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-etc-kubernetes\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519727 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f571603b-6223-4f16-b5fa-019ef7c4abb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519745 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba970a98-5bee-40d6-ade6-6dcbed87b581-proxy-tls\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519847 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-etc-openvswitch\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519891 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-openvswitch\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519909 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1de5dc4e-ef42-48fc-be23-eaec2039c031-cni-binary-copy\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519929 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6tc5\" (UniqueName: \"kubernetes.io/projected/ba970a98-5bee-40d6-ade6-6dcbed87b581-kube-api-access-q6tc5\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519952 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-daemon-config\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519968 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f571603b-6223-4f16-b5fa-019ef7c4abb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.519985 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/ba970a98-5bee-40d6-ade6-6dcbed87b581-mcd-auth-proxy-config\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520037 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-kubelet\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520092 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-run-netns\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520168 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-config\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520192 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5wwd\" (UniqueName: \"kubernetes.io/projected/99eb233c-7094-4a86-ab37-0b160001bbef-kube-api-access-z5wwd\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520212 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-cni-dir\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520251 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-run-multus-certs\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520299 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-systemd-units\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520321 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-cnibin\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520362 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-bin\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520379 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-system-cni-dir\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520402 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-run-k8s-cni-cncf-io\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520422 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-netns\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520453 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-log-socket\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520469 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-ovn-kubernetes\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520505 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-slash\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520527 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-node-log\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520548 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-var-lib-cni-bin\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520564 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520582 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-system-cni-dir\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520605 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-os-release\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520626 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-var-lib-kubelet\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520660 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-ovn\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520682 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-os-release\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520704 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-var-lib-cni-multus\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520728 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-systemd\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520756 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-socket-dir-parent\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.520804 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-conf-dir\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.521126 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"nam
e\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.536007 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.549871 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.564769 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.576781 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.587382 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.601610 4697 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-20 16:26:55 +0000 UTC, rotation deadline is 2026-12-09 08:42:46.617719779 +0000 UTC Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.601681 4697 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7000h10m50.016042955s for next certificate rotation Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.603268 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.620649 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.621984 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-var-lib-cni-bin\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622030 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622053 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-system-cni-dir\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622076 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-os-release\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622116 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-ovn\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622141 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-os-release\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622130 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-var-lib-cni-bin\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622162 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-var-lib-cni-multus\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622228 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-var-lib-kubelet\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622286 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-system-cni-dir\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622294 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-ovn\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622346 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-var-lib-cni-multus\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622315 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-var-lib-kubelet\") pod \"multus-lrpxf\" (UID: 
\"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622254 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-systemd\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622420 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-socket-dir-parent\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622467 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-conf-dir\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622524 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-socket-dir-parent\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622542 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-netd\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc 
kubenswrapper[4697]: I0220 16:31:56.622575 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-netd\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622485 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-systemd\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622603 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622659 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622690 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-os-release\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 
crc kubenswrapper[4697]: I0220 16:31:56.622695 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-os-release\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622510 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-conf-dir\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622733 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-env-overrides\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622796 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-hostroot\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622832 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-cnibin\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622859 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br8r5\" (UniqueName: \"kubernetes.io/projected/f571603b-6223-4f16-b5fa-019ef7c4abb6-kube-api-access-br8r5\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622880 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622887 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ba970a98-5bee-40d6-ade6-6dcbed87b581-rootfs\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622917 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-var-lib-openvswitch\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622924 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f571603b-6223-4f16-b5fa-019ef7c4abb6-cnibin\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622943 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99eb233c-7094-4a86-ab37-0b160001bbef-ovn-node-metrics-cert\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.622975 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-hostroot\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.623017 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ba970a98-5bee-40d6-ade6-6dcbed87b581-rootfs\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.623022 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-var-lib-openvswitch\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.623221 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-script-lib\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.623282 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-956hf\" (UniqueName: \"kubernetes.io/projected/1de5dc4e-ef42-48fc-be23-eaec2039c031-kube-api-access-956hf\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.623337 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f571603b-6223-4f16-b5fa-019ef7c4abb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.623674 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-env-overrides\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.623932 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-script-lib\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.623969 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba970a98-5bee-40d6-ade6-6dcbed87b581-proxy-tls\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624053 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-etc-openvswitch\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624099 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-etc-openvswitch\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624116 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-openvswitch\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624145 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1de5dc4e-ef42-48fc-be23-eaec2039c031-cni-binary-copy\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624170 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-etc-kubernetes\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624186 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-openvswitch\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624197 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6tc5\" (UniqueName: \"kubernetes.io/projected/ba970a98-5bee-40d6-ade6-6dcbed87b581-kube-api-access-q6tc5\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624230 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-daemon-config\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624257 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f571603b-6223-4f16-b5fa-019ef7c4abb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624232 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-etc-kubernetes\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624281 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f571603b-6223-4f16-b5fa-019ef7c4abb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624292 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba970a98-5bee-40d6-ade6-6dcbed87b581-mcd-auth-proxy-config\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624352 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-kubelet\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624378 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-run-netns\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624407 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-config\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624437 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5wwd\" (UniqueName: \"kubernetes.io/projected/99eb233c-7094-4a86-ab37-0b160001bbef-kube-api-access-z5wwd\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624481 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-kubelet\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624509 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-run-netns\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624517 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-cni-dir\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624652 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-run-multus-certs\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624676 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-systemd-units\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624695 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-cnibin\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624712 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-bin\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624722 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-cni-dir\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624728 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-system-cni-dir\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624771 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-run-k8s-cni-cncf-io\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624797 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-netns\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624824 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-log-socket\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624844 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-run-multus-certs\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624849 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-ovn-kubernetes\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624871 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-systemd-units\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624892 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-slash\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624903 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-cnibin\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624917 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-node-log\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624923 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-bin\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624826 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-system-cni-dir\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624980 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-log-socket\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.624989 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-ovn-kubernetes\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.625004 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-slash\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.625016 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1de5dc4e-ef42-48fc-be23-eaec2039c031-host-run-k8s-cni-cncf-io\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.625045 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-node-log\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.625050 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-netns\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.625131 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-config\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.625195 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba970a98-5bee-40d6-ade6-6dcbed87b581-mcd-auth-proxy-config\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.625261 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f571603b-6223-4f16-b5fa-019ef7c4abb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.625556 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1de5dc4e-ef42-48fc-be23-eaec2039c031-multus-daemon-config\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.625687 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1de5dc4e-ef42-48fc-be23-eaec2039c031-cni-binary-copy\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.628993 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99eb233c-7094-4a86-ab37-0b160001bbef-ovn-node-metrics-cert\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.629001 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba970a98-5bee-40d6-ade6-6dcbed87b581-proxy-tls\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.637249 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.640188 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br8r5\" (UniqueName: \"kubernetes.io/projected/f571603b-6223-4f16-b5fa-019ef7c4abb6-kube-api-access-br8r5\") pod \"multus-additional-cni-plugins-vtsdj\" (UID: \"f571603b-6223-4f16-b5fa-019ef7c4abb6\") " pod="openshift-multus/multus-additional-cni-plugins-vtsdj"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.642490 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6tc5\" (UniqueName: \"kubernetes.io/projected/ba970a98-5bee-40d6-ade6-6dcbed87b581-kube-api-access-q6tc5\") pod \"machine-config-daemon-bgvrc\" (UID: \"ba970a98-5bee-40d6-ade6-6dcbed87b581\") " pod="openshift-machine-config-operator/machine-config-daemon-bgvrc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.644796 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-956hf\" (UniqueName: \"kubernetes.io/projected/1de5dc4e-ef42-48fc-be23-eaec2039c031-kube-api-access-956hf\") pod \"multus-lrpxf\" (UID: \"1de5dc4e-ef42-48fc-be23-eaec2039c031\") " pod="openshift-multus/multus-lrpxf"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.644975 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5wwd\" (UniqueName: \"kubernetes.io/projected/99eb233c-7094-4a86-ab37-0b160001bbef-kube-api-access-z5wwd\") pod \"ovnkube-node-9zpdc\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.651764 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z"
Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.670747 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.724425 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lrpxf" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.731868 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.740052 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.749289 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.830793 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 08:46:40.728883838 +0000 UTC Feb 20 16:31:56 crc kubenswrapper[4697]: I0220 16:31:56.876708 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:31:56 crc kubenswrapper[4697]: E0220 16:31:56.876844 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.039217 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c" exitCode=0 Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.039298 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c"} Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.039380 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"e089f465ff1499c7b6927f2a1d6395273892a6e3ad36ce8abdb8d491f59a5da1"} Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.043769 4697 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6"} Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.043844 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"c6dcec5975e84e9c91d970b5300f02ddcb5ded81466891fa3f4ae36228289761"} Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.046482 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lrpxf" event={"ID":"1de5dc4e-ef42-48fc-be23-eaec2039c031","Type":"ContainerStarted","Data":"d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280"} Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.046544 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lrpxf" event={"ID":"1de5dc4e-ef42-48fc-be23-eaec2039c031","Type":"ContainerStarted","Data":"33363ad41ef4b8fbef5d746c069244efa377a9f7214e471ec492b907737b31d1"} Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.047742 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" event={"ID":"f571603b-6223-4f16-b5fa-019ef7c4abb6","Type":"ContainerStarted","Data":"ab98c8a31cf07a4ab012e015e1e39c35239a24ef41b56b90016c3d234ce68914"} Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.048931 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nvnfb" event={"ID":"4e96c5eb-134a-4c03-9899-8f97a9aba0b9","Type":"ContainerStarted","Data":"95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0"} Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.048967 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nvnfb" 
event={"ID":"4e96c5eb-134a-4c03-9899-8f97a9aba0b9","Type":"ContainerStarted","Data":"10499bb38ba24ce7fffd58a2b478a9387c5624ce28f370d645d9be0853161cfb"} Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.057557 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.071783 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.091387 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.106844 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.121985 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.136466 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.149645 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.182037 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3
ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.205580 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.221653 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.241681 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.257302 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.270663 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.288171 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.302972 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.315952 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.331514 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.345646 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.363162 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.381701 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.393115 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.410843 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.444502 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.464165 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.479725 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.492707 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.512608 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.531576 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.831604 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 00:55:42.148296197 +0000 UTC Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.876247 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:31:57 crc kubenswrapper[4697]: I0220 16:31:57.876284 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:57 crc kubenswrapper[4697]: E0220 16:31:57.876374 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:31:57 crc kubenswrapper[4697]: E0220 16:31:57.876541 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.059925 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.060405 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.060420 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.060466 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.060480 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.060490 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" 
event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.062501 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.064789 4697 generic.go:334] "Generic (PLEG): container finished" podID="f571603b-6223-4f16-b5fa-019ef7c4abb6" containerID="9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4" exitCode=0 Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.064854 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" event={"ID":"f571603b-6223-4f16-b5fa-019ef7c4abb6","Type":"ContainerDied","Data":"9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.079336 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.092802 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.103890 4697 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.106930 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.106962 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.106972 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.107109 4697 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.110406 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.114819 4697 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.114890 4697 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.115780 4697 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.115811 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.115820 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.115835 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.115846 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.129613 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: E0220 16:31:58.143384 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.144512 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.147126 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.147159 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.147170 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.147189 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.147201 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: E0220 16:31:58.160462 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.161041 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.165419 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.165490 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.165504 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.165534 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.165546 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: E0220 16:31:58.178163 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.179019 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.182482 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.182509 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.182517 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.182532 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.182545 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: E0220 16:31:58.199859 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.202583 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.205004 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.205073 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.205087 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.205111 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.205125 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.217353 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: E0220 16:31:58.217406 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: E0220 16:31:58.217827 4697 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.219881 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.219923 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.219934 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.219954 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.219968 4697 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.229757 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.249909 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.262131 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96d
d207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.275885 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.290953 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.309857 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.323477 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.323514 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.323524 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.323546 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.323558 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.324162 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.338770 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.352295 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.364821 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.381693 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.393582 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.406740 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.426145 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.426990 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.427048 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.427065 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.427089 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.427105 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.447517 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.459967 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.472104 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.488008 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.505553 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96d
d207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:58Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.531227 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.531281 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.531293 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc 
kubenswrapper[4697]: I0220 16:31:58.531319 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.531334 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.634350 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.634409 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.634421 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.634459 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.634474 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.737211 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.737283 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.737300 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.737330 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.737347 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.832279 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 23:51:01.73532162 +0000 UTC Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.840511 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.840548 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.840561 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.840577 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.840587 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.876804 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:31:58 crc kubenswrapper[4697]: E0220 16:31:58.876931 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.943928 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.943976 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.943986 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.944007 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:58 crc kubenswrapper[4697]: I0220 16:31:58.944019 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:58Z","lastTransitionTime":"2026-02-20T16:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.038926 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jngbq"] Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.039323 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jngbq" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.041304 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.041411 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.041500 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.042389 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.047009 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.047052 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.047071 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.047097 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.047115 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:59Z","lastTransitionTime":"2026-02-20T16:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.057195 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.075996 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.079067 4697 generic.go:334] "Generic (PLEG): container finished" podID="f571603b-6223-4f16-b5fa-019ef7c4abb6" containerID="4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5" exitCode=0 Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.079376 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" event={"ID":"f571603b-6223-4f16-b5fa-019ef7c4abb6","Type":"ContainerDied","Data":"4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5"} Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.090704 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.110342 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.129398 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.148562 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/502233fe-4219-44a5-9ddb-66eae7401369-host\") pod \"node-ca-jngbq\" (UID: \"502233fe-4219-44a5-9ddb-66eae7401369\") " pod="openshift-image-registry/node-ca-jngbq" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.148921 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s484d\" (UniqueName: \"kubernetes.io/projected/502233fe-4219-44a5-9ddb-66eae7401369-kube-api-access-s484d\") pod \"node-ca-jngbq\" (UID: \"502233fe-4219-44a5-9ddb-66eae7401369\") " pod="openshift-image-registry/node-ca-jngbq" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.149041 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/502233fe-4219-44a5-9ddb-66eae7401369-serviceca\") pod \"node-ca-jngbq\" (UID: \"502233fe-4219-44a5-9ddb-66eae7401369\") " pod="openshift-image-registry/node-ca-jngbq" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.150601 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.150652 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.150665 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.150688 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.150702 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:59Z","lastTransitionTime":"2026-02-20T16:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.152200 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.168683 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.182318 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.199897 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.213722 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96d
d207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.228657 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.243660 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.250039 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/502233fe-4219-44a5-9ddb-66eae7401369-host\") pod \"node-ca-jngbq\" (UID: \"502233fe-4219-44a5-9ddb-66eae7401369\") " pod="openshift-image-registry/node-ca-jngbq" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.250641 4697 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s484d\" (UniqueName: \"kubernetes.io/projected/502233fe-4219-44a5-9ddb-66eae7401369-kube-api-access-s484d\") pod \"node-ca-jngbq\" (UID: \"502233fe-4219-44a5-9ddb-66eae7401369\") " pod="openshift-image-registry/node-ca-jngbq" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.250846 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/502233fe-4219-44a5-9ddb-66eae7401369-serviceca\") pod \"node-ca-jngbq\" (UID: \"502233fe-4219-44a5-9ddb-66eae7401369\") " pod="openshift-image-registry/node-ca-jngbq" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.250382 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/502233fe-4219-44a5-9ddb-66eae7401369-host\") pod \"node-ca-jngbq\" (UID: \"502233fe-4219-44a5-9ddb-66eae7401369\") " pod="openshift-image-registry/node-ca-jngbq" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.253277 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/502233fe-4219-44a5-9ddb-66eae7401369-serviceca\") pod \"node-ca-jngbq\" (UID: \"502233fe-4219-44a5-9ddb-66eae7401369\") " pod="openshift-image-registry/node-ca-jngbq" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.254067 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.254119 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.254129 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.254152 4697 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.254162 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:59Z","lastTransitionTime":"2026-02-20T16:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.255996 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.269166 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.276032 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s484d\" (UniqueName: \"kubernetes.io/projected/502233fe-4219-44a5-9ddb-66eae7401369-kube-api-access-s484d\") pod \"node-ca-jngbq\" (UID: \"502233fe-4219-44a5-9ddb-66eae7401369\") " pod="openshift-image-registry/node-ca-jngbq" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.283170 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.297920 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.314090 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.327100 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.342399 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.351163 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jngbq" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.357269 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.357315 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.357324 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.357345 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.357360 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:59Z","lastTransitionTime":"2026-02-20T16:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.360921 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: W0220 16:31:59.368206 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod502233fe_4219_44a5_9ddb_66eae7401369.slice/crio-343d7f12a5ca5271ee8db92b36f51ceef422b0e330cf976a78c672f3a134d7a4 WatchSource:0}: Error finding container 343d7f12a5ca5271ee8db92b36f51ceef422b0e330cf976a78c672f3a134d7a4: Status 404 returned error can't find the container with id 343d7f12a5ca5271ee8db92b36f51ceef422b0e330cf976a78c672f3a134d7a4 Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.378387 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.398568 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.425198 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.443207 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.459600 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.460910 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.460969 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.460980 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.460995 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.461005 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:59Z","lastTransitionTime":"2026-02-20T16:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.474065 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.485958 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96d
d207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.505788 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.518844 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.531910 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:31:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.553509 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.553663 4697 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:32:07.553639146 +0000 UTC m=+35.333684554 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.553727 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.553799 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.553902 4697 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.553935 4697 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:07.553928023 +0000 UTC m=+35.333973431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.553934 4697 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.554025 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:07.554006655 +0000 UTC m=+35.334052073 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.564685 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.564727 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.564735 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.564752 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.564763 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:59Z","lastTransitionTime":"2026-02-20T16:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.655082 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.655136 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.655253 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.655268 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.655279 4697 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.655308 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 
16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.655360 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.655382 4697 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.655326 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:07.655312592 +0000 UTC m=+35.435358000 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.655541 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:07.655513576 +0000 UTC m=+35.435559024 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.667372 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.667430 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.667483 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.667509 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.667528 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:59Z","lastTransitionTime":"2026-02-20T16:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.770940 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.770980 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.770991 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.771008 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.771017 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:59Z","lastTransitionTime":"2026-02-20T16:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.832861 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 14:31:37.322364125 +0000 UTC Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.873391 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.873424 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.873435 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.873471 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.873480 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:59Z","lastTransitionTime":"2026-02-20T16:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.876150 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.876235 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.876266 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:31:59 crc kubenswrapper[4697]: E0220 16:31:59.876304 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.977271 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.977674 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.977685 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.977703 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:31:59 crc kubenswrapper[4697]: I0220 16:31:59.977719 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:31:59Z","lastTransitionTime":"2026-02-20T16:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.081212 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.081248 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.081259 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.081274 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.081287 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:00Z","lastTransitionTime":"2026-02-20T16:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.089390 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.092722 4697 generic.go:334] "Generic (PLEG): container finished" podID="f571603b-6223-4f16-b5fa-019ef7c4abb6" containerID="7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5" exitCode=0 Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.092772 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" event={"ID":"f571603b-6223-4f16-b5fa-019ef7c4abb6","Type":"ContainerDied","Data":"7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.095536 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jngbq" event={"ID":"502233fe-4219-44a5-9ddb-66eae7401369","Type":"ContainerStarted","Data":"f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.095569 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jngbq" event={"ID":"502233fe-4219-44a5-9ddb-66eae7401369","Type":"ContainerStarted","Data":"343d7f12a5ca5271ee8db92b36f51ceef422b0e330cf976a78c672f3a134d7a4"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.112581 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96d
d207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.152166 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.184709 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.185356 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.185412 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.185431 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.185499 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.185520 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:00Z","lastTransitionTime":"2026-02-20T16:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.201170 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.225139 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.246217 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.265337 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.280359 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.290212 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.290250 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.290259 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 
16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.290274 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.290284 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:00Z","lastTransitionTime":"2026-02-20T16:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.300611 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operat
or\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.317743 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.337590 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.357170 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.375364 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.389631 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.395800 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.395826 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.395834 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.395848 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.395856 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:00Z","lastTransitionTime":"2026-02-20T16:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.399244 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.411115 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.425063 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.436065 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.451139 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.468356 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.481721 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.494364 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.503115 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.503161 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.503174 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.503192 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.503205 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:00Z","lastTransitionTime":"2026-02-20T16:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.512033 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 
16:32:00.525289 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.546188 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.561611 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.572151 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.605175 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.605210 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.605287 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:00 crc 
kubenswrapper[4697]: I0220 16:32:00.605341 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.605352 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:00Z","lastTransitionTime":"2026-02-20T16:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.611801 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.656684 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.692137 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.708387 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.708544 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.708620 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.708688 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.708762 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:00Z","lastTransitionTime":"2026-02-20T16:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.812735 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.812984 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.813104 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.813202 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.813292 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:00Z","lastTransitionTime":"2026-02-20T16:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.833088 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 09:50:23.24080008 +0000 UTC Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.877026 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:00 crc kubenswrapper[4697]: E0220 16:32:00.877133 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.915736 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.915786 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.915811 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.915860 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:00 crc kubenswrapper[4697]: I0220 16:32:00.915872 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:00Z","lastTransitionTime":"2026-02-20T16:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.020041 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.020088 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.020100 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.020120 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.020145 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:01Z","lastTransitionTime":"2026-02-20T16:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.105506 4697 generic.go:334] "Generic (PLEG): container finished" podID="f571603b-6223-4f16-b5fa-019ef7c4abb6" containerID="01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd" exitCode=0 Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.105559 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" event={"ID":"f571603b-6223-4f16-b5fa-019ef7c4abb6","Type":"ContainerDied","Data":"01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd"} Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.123960 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.124041 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.124066 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.124099 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.124127 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:01Z","lastTransitionTime":"2026-02-20T16:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.131761 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.146080 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.174787 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.193955 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.209206 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.223387 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.226749 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.226787 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.226796 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.226812 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.226823 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:01Z","lastTransitionTime":"2026-02-20T16:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.238581 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.249887 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.291739 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.312096 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.329495 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.329521 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.329530 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.329562 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.329572 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:01Z","lastTransitionTime":"2026-02-20T16:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.336398 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.352228 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.370201 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.384013 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.398215 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:01Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.432921 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.432974 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.433003 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.433027 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.433042 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:01Z","lastTransitionTime":"2026-02-20T16:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.535340 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.535380 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.535390 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.535407 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.535418 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:01Z","lastTransitionTime":"2026-02-20T16:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.638634 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.638684 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.638694 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.638712 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.638724 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:01Z","lastTransitionTime":"2026-02-20T16:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.741850 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.741909 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.741924 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.741946 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.741962 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:01Z","lastTransitionTime":"2026-02-20T16:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.834164 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 07:45:32.737648759 +0000 UTC Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.843962 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.844005 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.844017 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.844036 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.844047 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:01Z","lastTransitionTime":"2026-02-20T16:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.876544 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.876624 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:01 crc kubenswrapper[4697]: E0220 16:32:01.876692 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:01 crc kubenswrapper[4697]: E0220 16:32:01.876753 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.946630 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.946684 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.946699 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.946722 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:01 crc kubenswrapper[4697]: I0220 16:32:01.946740 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:01Z","lastTransitionTime":"2026-02-20T16:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.049256 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.049315 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.049332 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.049355 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.049373 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:02Z","lastTransitionTime":"2026-02-20T16:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.114125 4697 generic.go:334] "Generic (PLEG): container finished" podID="f571603b-6223-4f16-b5fa-019ef7c4abb6" containerID="246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8" exitCode=0 Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.114190 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" event={"ID":"f571603b-6223-4f16-b5fa-019ef7c4abb6","Type":"ContainerDied","Data":"246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.138246 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.152167 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.152224 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.152241 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.152296 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.152313 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:02Z","lastTransitionTime":"2026-02-20T16:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.160126 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.186365 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.205199 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.219882 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.238140 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.255371 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.255406 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.255416 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.255431 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.255454 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:02Z","lastTransitionTime":"2026-02-20T16:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.271766 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.300270 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.326866 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.342666 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.358839 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.359097 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.359283 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.359310 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.359338 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.359357 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:02Z","lastTransitionTime":"2026-02-20T16:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.372227 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.392862 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.406891 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.423215 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.463259 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.463298 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.463311 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.463332 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.463344 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:02Z","lastTransitionTime":"2026-02-20T16:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.548212 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.567069 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.567575 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.567611 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.567642 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.567687 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:02Z","lastTransitionTime":"2026-02-20T16:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.575607 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.593716 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.612804 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.626368 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.643596 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.669041 4697 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.672195 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\
"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd/pods/etcd-crc/status\": read tcp 38.102.83.44:35848->38.102.83.44:6443: use of closed network connection" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.673859 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.673919 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.673940 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.673968 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 
16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.673987 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:02Z","lastTransitionTime":"2026-02-20T16:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.705101 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.721602 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.744822 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.762905 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96d
d207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.776506 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.776563 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.776576 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:02 crc 
kubenswrapper[4697]: I0220 16:32:02.776598 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.776615 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:02Z","lastTransitionTime":"2026-02-20T16:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.778542 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.794600 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.807537 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.823799 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.834675 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 08:18:28.809319975 +0000 UTC Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.837284 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.876969 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:02 crc kubenswrapper[4697]: E0220 16:32:02.877508 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.878773 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.878810 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.878821 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.878839 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.878851 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:02Z","lastTransitionTime":"2026-02-20T16:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.898620 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.917230 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.931841 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.946631 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.970935 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.981947 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.981990 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.982003 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.982023 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.982039 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:02Z","lastTransitionTime":"2026-02-20T16:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:02 crc kubenswrapper[4697]: I0220 16:32:02.989186 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.008060 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.029457 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.042612 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96d
d207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.073348 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.085183 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.085253 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.085275 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.085301 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.085319 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:03Z","lastTransitionTime":"2026-02-20T16:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.092700 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.106341 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.123471 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"af33544fff13f33a347b7c32e294f5f94d3a258cb9b5068f6d03ad4d3d0a4b37"} Feb 20 16:32:03 
crc kubenswrapper[4697]: I0220 16:32:03.124209 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.124326 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.127515 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d93
5fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.130294 4697 generic.go:334] "Generic (PLEG): container finished" podID="f571603b-6223-4f16-b5fa-019ef7c4abb6" containerID="6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be" exitCode=0 Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.130347 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" 
event={"ID":"f571603b-6223-4f16-b5fa-019ef7c4abb6","Type":"ContainerDied","Data":"6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be"} Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.150421 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.160812 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.163916 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.169850 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.186182 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.188607 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.188639 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.188651 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.188673 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.188689 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:03Z","lastTransitionTime":"2026-02-20T16:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.206982 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.225254 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.257191 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.291636 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.291676 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.291688 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.291707 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.291718 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:03Z","lastTransitionTime":"2026-02-20T16:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.295165 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.337150 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.377929 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af33544fff13f33a347b7c32e294f5f94d3a258cb9b5068f6d03ad4d3d0a4b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.393864 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.393899 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.393910 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.393927 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.393939 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:03Z","lastTransitionTime":"2026-02-20T16:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.416145 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.455138 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.496637 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.496722 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.496744 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.496777 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.496800 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:03Z","lastTransitionTime":"2026-02-20T16:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.497713 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f869
0017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.532432 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96d
d207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.577871 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.599678 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.599756 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.599776 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.599860 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.599893 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:03Z","lastTransitionTime":"2026-02-20T16:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.614458 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.652964 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.700030 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.702376 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.702427 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.702473 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.702497 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.702513 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:03Z","lastTransitionTime":"2026-02-20T16:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.806136 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.806187 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.806206 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.806226 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.806239 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:03Z","lastTransitionTime":"2026-02-20T16:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.835382 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 06:52:26.216366196 +0000 UTC Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.876103 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.876179 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:03 crc kubenswrapper[4697]: E0220 16:32:03.876283 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:03 crc kubenswrapper[4697]: E0220 16:32:03.876364 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.908502 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.908577 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.908595 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.908625 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:03 crc kubenswrapper[4697]: I0220 16:32:03.908710 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:03Z","lastTransitionTime":"2026-02-20T16:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.011179 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.011262 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.011273 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.011289 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.011301 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:04Z","lastTransitionTime":"2026-02-20T16:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.114903 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.114985 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.115005 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.115036 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.115057 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:04Z","lastTransitionTime":"2026-02-20T16:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.140186 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.140184 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" event={"ID":"f571603b-6223-4f16-b5fa-019ef7c4abb6","Type":"ContainerStarted","Data":"4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12"} Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.157614 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.172304 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.189005 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.202359 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: 
I0220 16:32:04.217353 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.217402 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.217419 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.217468 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.217484 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:04Z","lastTransitionTime":"2026-02-20T16:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.222605 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.240070 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.260726 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.278810 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.297252 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.314813 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.320257 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.320333 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.320346 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.320371 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.320385 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:04Z","lastTransitionTime":"2026-02-20T16:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.380824 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.397501 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.421779 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.423692 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.423732 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.423744 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.423768 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.423784 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:04Z","lastTransitionTime":"2026-02-20T16:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.440585 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z 
is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.458090 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af33544fff13f33a347b7c32e294f5f94d3a258cb9b5068f6d03ad4d3d0a4b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:04Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.526550 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.526603 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.526614 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.526634 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.526646 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:04Z","lastTransitionTime":"2026-02-20T16:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.628669 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.628718 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.628727 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.628745 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.628755 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:04Z","lastTransitionTime":"2026-02-20T16:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.730775 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.730806 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.730814 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.730827 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.730837 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:04Z","lastTransitionTime":"2026-02-20T16:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.833090 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.833571 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.833584 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.833605 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.833617 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:04Z","lastTransitionTime":"2026-02-20T16:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.836632 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 21:00:34.566039855 +0000 UTC Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.877099 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:04 crc kubenswrapper[4697]: E0220 16:32:04.877290 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.935591 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.935615 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.935624 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.935637 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:04 crc kubenswrapper[4697]: I0220 16:32:04.935646 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:04Z","lastTransitionTime":"2026-02-20T16:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.038295 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.038331 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.038339 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.038353 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.038362 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:05Z","lastTransitionTime":"2026-02-20T16:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.141859 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.142139 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.142460 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.143254 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.143494 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:05Z","lastTransitionTime":"2026-02-20T16:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.145848 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/0.log" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.149166 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="af33544fff13f33a347b7c32e294f5f94d3a258cb9b5068f6d03ad4d3d0a4b37" exitCode=1 Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.149335 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"af33544fff13f33a347b7c32e294f5f94d3a258cb9b5068f6d03ad4d3d0a4b37"} Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.150211 4697 scope.go:117] "RemoveContainer" containerID="af33544fff13f33a347b7c32e294f5f94d3a258cb9b5068f6d03ad4d3d0a4b37" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.166228 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.190284 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af33544fff13f33a347b7c32e294f5f94d3a258cb9b5068f6d03ad4d3d0a4b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af33544fff13f33a347b7c32e294f5f94d3a258cb9b5068f6d03ad4d3d0a4b37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:05Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:05.080121 5977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:05.080151 5977 handler.go:190] Sending *v1.EgressFirewall 
event handler 9 for removal\\\\nI0220 16:32:05.080175 5977 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 16:32:05.080182 5977 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 16:32:05.080193 5977 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0220 16:32:05.080197 5977 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:05.080229 5977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:05.080249 5977 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:05.080237 5977 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 16:32:05.080240 5977 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 16:32:05.080293 5977 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 16:32:05.080257 5977 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 16:32:05.080336 5977 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 16:32:05.080331 5977 factory.go:656] Stopping watch factory\\\\nI0220 16:32:05.080354 5977 ovnkube.go:599] Stopped 
ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd8
6ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.204623 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.217956 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.228338 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.240454 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.245532 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.245555 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.245563 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.245576 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.245586 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:05Z","lastTransitionTime":"2026-02-20T16:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.259234 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.274273 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.299094 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.317678 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.333802 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.347540 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.349752 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.349836 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.349864 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.349903 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.349931 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:05Z","lastTransitionTime":"2026-02-20T16:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.360462 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.375079 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.389338 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:05Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.452532 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.452573 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.452584 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.452615 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.452626 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:05Z","lastTransitionTime":"2026-02-20T16:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.555399 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.555465 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.555477 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.555498 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.555511 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:05Z","lastTransitionTime":"2026-02-20T16:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.657011 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.657043 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.657051 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.657066 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.657075 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:05Z","lastTransitionTime":"2026-02-20T16:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.759309 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.759384 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.759398 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.759422 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.759478 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:05Z","lastTransitionTime":"2026-02-20T16:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.837115 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 11:33:38.886844284 +0000 UTC Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.861757 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.861806 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.861817 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.861832 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.861843 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:05Z","lastTransitionTime":"2026-02-20T16:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.876994 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:05 crc kubenswrapper[4697]: E0220 16:32:05.877091 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.876994 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:05 crc kubenswrapper[4697]: E0220 16:32:05.877150 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.964742 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.964781 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.964791 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.964809 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:05 crc kubenswrapper[4697]: I0220 16:32:05.964820 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:05Z","lastTransitionTime":"2026-02-20T16:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.066678 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.066721 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.066732 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.066752 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.066764 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:06Z","lastTransitionTime":"2026-02-20T16:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.154704 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/1.log" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.155419 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/0.log" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.157795 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e" exitCode=1 Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.157838 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e"} Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.157889 4697 scope.go:117] "RemoveContainer" containerID="af33544fff13f33a347b7c32e294f5f94d3a258cb9b5068f6d03ad4d3d0a4b37" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.158650 4697 scope.go:117] "RemoveContainer" containerID="5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e" Feb 20 16:32:06 crc kubenswrapper[4697]: E0220 16:32:06.158992 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.169058 4697 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.169103 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.169111 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.169125 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.169136 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:06Z","lastTransitionTime":"2026-02-20T16:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.175216 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.189891 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.211638 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af33544fff13f33a347b7c32e294f5f94d3a258cb9b5068f6d03ad4d3d0a4b37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:05Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:05.080121 5977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:05.080151 5977 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0220 16:32:05.080175 5977 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0220 16:32:05.080182 5977 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 16:32:05.080193 5977 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0220 16:32:05.080197 5977 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:05.080229 5977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:05.080249 5977 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:05.080237 5977 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 16:32:05.080240 5977 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 16:32:05.080293 5977 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 16:32:05.080257 5977 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 16:32:05.080336 5977 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 16:32:05.080331 5977 factory.go:656] Stopping watch factory\\\\nI0220 16:32:05.080354 5977 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:06Z\\\",\\\"message\\\":\\\"r 0 failed attempt(s)\\\\nI0220 16:32:06.075986 6135 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0220 16:32:06.075985 6135 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0220 16:32:06.076022 6135 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0220 16:32:06.075805 6135 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076029 6135 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0220 16:32:06.076062 6135 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076089 6135 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc in node crc\\\\nI0220 16:32:06.076103 6135 obj\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\
\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.229573 4697 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.243555 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.258617 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.270182 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: 
I0220 16:32:06.272216 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.272241 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.272250 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.272264 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.272274 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:06Z","lastTransitionTime":"2026-02-20T16:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.287927 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.301930 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.316000 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.328552 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.350420 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.363871 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.374896 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.374944 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.374962 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.374984 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.375000 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:06Z","lastTransitionTime":"2026-02-20T16:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.383012 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.402798 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:06Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.477016 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.477065 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.477080 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.477103 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.477118 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:06Z","lastTransitionTime":"2026-02-20T16:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.579496 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.579545 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.579556 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.579573 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.579585 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:06Z","lastTransitionTime":"2026-02-20T16:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.683507 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.683553 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.683562 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.683579 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.683589 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:06Z","lastTransitionTime":"2026-02-20T16:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.787857 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.788024 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.788050 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.788088 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.788190 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:06Z","lastTransitionTime":"2026-02-20T16:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.837842 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 11:08:31.512379503 +0000 UTC Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.876712 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:06 crc kubenswrapper[4697]: E0220 16:32:06.877039 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.891410 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.891528 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.891549 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.891583 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.891604 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:06Z","lastTransitionTime":"2026-02-20T16:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.994782 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.994866 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.994887 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.994921 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:06 crc kubenswrapper[4697]: I0220 16:32:06.994942 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:06Z","lastTransitionTime":"2026-02-20T16:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.097757 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.097815 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.097840 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.097869 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.097889 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:07Z","lastTransitionTime":"2026-02-20T16:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.164343 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/1.log" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.170830 4697 scope.go:117] "RemoveContainer" containerID="5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e" Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.172507 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.194908 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.200831 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.200879 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.200900 4697 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.200928 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.200947 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:07Z","lastTransitionTime":"2026-02-20T16:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.214826 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78
a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.238171 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.255324 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.269719 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.283138 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.296090 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.302985 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.303029 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.303042 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.303061 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.303073 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:07Z","lastTransitionTime":"2026-02-20T16:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.307561 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.327559 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.346953 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:06Z\\\",\\\"message\\\":\\\"r 0 failed attempt(s)\\\\nI0220 16:32:06.075986 6135 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0220 16:32:06.075985 6135 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0220 16:32:06.076022 6135 
default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0220 16:32:06.075805 6135 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076029 6135 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0220 16:32:06.076062 6135 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076089 6135 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc in node crc\\\\nI0220 16:32:06.076103 6135 obj\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.362934 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.374618 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.387598 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.403123 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: 
I0220 16:32:07.405593 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.405643 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.405658 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.405679 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.405692 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:07Z","lastTransitionTime":"2026-02-20T16:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.428208 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:07Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.509212 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.509265 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.509279 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.509301 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.509315 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:07Z","lastTransitionTime":"2026-02-20T16:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.613630 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.613696 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.613709 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.613730 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.613753 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:07Z","lastTransitionTime":"2026-02-20T16:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.645143 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.645254 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.645313 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.645428 4697 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.645494 4697 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.645521 4697 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:23.645502847 +0000 UTC m=+51.425548265 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.645677 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:32:23.645644561 +0000 UTC m=+51.425690209 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.645716 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:23.645702472 +0000 UTC m=+51.425747990 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.717083 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.717151 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.717168 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.717197 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.717221 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:07Z","lastTransitionTime":"2026-02-20T16:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.745998 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.746095 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.746298 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.746325 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.746325 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.746402 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.746428 4697 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.746346 4697 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.746557 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:23.746534026 +0000 UTC m=+51.526579464 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.746587 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:23.746573537 +0000 UTC m=+51.526618975 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.820064 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.820114 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.820123 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.820141 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.820152 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:07Z","lastTransitionTime":"2026-02-20T16:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.838423 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 06:26:45.497385433 +0000 UTC Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.876978 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.876978 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.877146 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:07 crc kubenswrapper[4697]: E0220 16:32:07.877222 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.923217 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.923260 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.923271 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.923293 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:07 crc kubenswrapper[4697]: I0220 16:32:07.923307 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:07Z","lastTransitionTime":"2026-02-20T16:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.025918 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.025958 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.025967 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.025985 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.025994 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.128820 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.128887 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.128906 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.128931 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.128945 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.232717 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.232794 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.232818 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.232851 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.232880 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.336407 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.336518 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.336540 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.336582 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.336604 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.440391 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.440451 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.440461 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.440480 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.440492 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.476385 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.476492 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.476514 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.476554 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.476575 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: E0220 16:32:08.497000 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:08Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.503630 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.503706 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.503727 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.503788 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.503810 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: E0220 16:32:08.528547 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:08Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.533803 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.533857 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.533876 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.533900 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.533947 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.558484 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.558567 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.558594 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.558625 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.558646 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.579703 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.579747 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.579766 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.579798 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.579821 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: E0220 16:32:08.599107 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:08Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:08 crc kubenswrapper[4697]: E0220 16:32:08.599268 4697 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.601300 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.601343 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.601359 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.601381 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.601396 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.703686 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.703744 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.703761 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.703790 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.703810 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.807648 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.807734 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.807757 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.807791 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.807812 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.838918 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 23:20:16.37092215 +0000 UTC Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.876239 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:08 crc kubenswrapper[4697]: E0220 16:32:08.876485 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.911979 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.912030 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.912050 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.912074 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:08 crc kubenswrapper[4697]: I0220 16:32:08.912094 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:08Z","lastTransitionTime":"2026-02-20T16:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.014416 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.014480 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.014492 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.014511 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.014524 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:09Z","lastTransitionTime":"2026-02-20T16:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.117319 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.117405 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.117502 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.117546 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.117573 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:09Z","lastTransitionTime":"2026-02-20T16:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.220779 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.220850 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.220870 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.220900 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.221055 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:09Z","lastTransitionTime":"2026-02-20T16:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.323970 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.324011 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.324022 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.324042 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.324054 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:09Z","lastTransitionTime":"2026-02-20T16:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.426979 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.427073 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.427093 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.427128 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.427148 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:09Z","lastTransitionTime":"2026-02-20T16:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.538701 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.538829 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.538852 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.538887 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.538907 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:09Z","lastTransitionTime":"2026-02-20T16:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.642738 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.642812 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.642840 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.642874 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.642900 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:09Z","lastTransitionTime":"2026-02-20T16:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.682378 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx"] Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.683285 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.686243 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.686769 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.703865 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.724731 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.742502 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.746294 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.746342 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.746364 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.746388 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.746408 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:09Z","lastTransitionTime":"2026-02-20T16:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.763210 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.779263 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.794499 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb182755-c6aa-48db-8be3-2c3e23b4b41b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: \"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.794579 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb182755-c6aa-48db-8be3-2c3e23b4b41b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: \"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.794619 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssvmq\" (UniqueName: \"kubernetes.io/projected/cb182755-c6aa-48db-8be3-2c3e23b4b41b-kube-api-access-ssvmq\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: \"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.794817 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb182755-c6aa-48db-8be3-2c3e23b4b41b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: \"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.797695 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39
aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.817918 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.839165 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation 
deadline is 2026-01-04 02:45:00.322126585 +0000 UTC Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.842116 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:06Z\\\",\\\"message\\\":\\\"r 0 failed attempt(s)\\\\nI0220 16:32:06.075986 6135 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0220 16:32:06.075985 6135 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0220 16:32:06.076022 6135 
default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0220 16:32:06.075805 6135 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076029 6135 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0220 16:32:06.076062 6135 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076089 6135 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc in node crc\\\\nI0220 16:32:06.076103 6135 obj\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.849390 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.849483 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.849510 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.849541 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.849568 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:09Z","lastTransitionTime":"2026-02-20T16:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.859303 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.876787 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.876795 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:09 crc kubenswrapper[4697]: E0220 16:32:09.877013 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:09 crc kubenswrapper[4697]: E0220 16:32:09.877172 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.878544 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.896156 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.896557 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb182755-c6aa-48db-8be3-2c3e23b4b41b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: \"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.896631 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb182755-c6aa-48db-8be3-2c3e23b4b41b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: \"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 
16:32:09.896694 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb182755-c6aa-48db-8be3-2c3e23b4b41b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: \"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.896745 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvmq\" (UniqueName: \"kubernetes.io/projected/cb182755-c6aa-48db-8be3-2c3e23b4b41b-kube-api-access-ssvmq\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: \"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.897887 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb182755-c6aa-48db-8be3-2c3e23b4b41b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: \"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.898166 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb182755-c6aa-48db-8be3-2c3e23b4b41b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: \"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.910027 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb182755-c6aa-48db-8be3-2c3e23b4b41b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: 
\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.915355 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.928588 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssvmq\" (UniqueName: \"kubernetes.io/projected/cb182755-c6aa-48db-8be3-2c3e23b4b41b-kube-api-access-ssvmq\") pod \"ovnkube-control-plane-749d76644c-2x8hx\" (UID: \"cb182755-c6aa-48db-8be3-2c3e23b4b41b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.932290 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96d
d207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.953354 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.953403 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.953416 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:09 crc 
kubenswrapper[4697]: I0220 16:32:09.953468 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.953488 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:09Z","lastTransitionTime":"2026-02-20T16:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.967478 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:09 crc kubenswrapper[4697]: I0220 16:32:09.991810 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:09Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.007012 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.014027 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.056688 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.056925 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.057010 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:10 crc 
kubenswrapper[4697]: I0220 16:32:10.057121 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.057205 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:10Z","lastTransitionTime":"2026-02-20T16:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.164177 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.164648 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.164674 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.164709 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.164907 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:10Z","lastTransitionTime":"2026-02-20T16:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.186046 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" event={"ID":"cb182755-c6aa-48db-8be3-2c3e23b4b41b","Type":"ContainerStarted","Data":"bde2caf14b4ba677576103ea420552bd38d4b5b30ae40b51f48dca7af4e69cd2"} Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.267788 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.267831 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.267847 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.267869 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.267884 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:10Z","lastTransitionTime":"2026-02-20T16:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.370917 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.371154 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.371251 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.371343 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.371419 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:10Z","lastTransitionTime":"2026-02-20T16:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.453664 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-nskrw"] Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.454156 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:10 crc kubenswrapper[4697]: E0220 16:32:10.454224 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.474620 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.474719 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.474747 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.474787 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.474818 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:10Z","lastTransitionTime":"2026-02-20T16:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.481297 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.505968 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.506047 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg8hj\" (UniqueName: \"kubernetes.io/projected/0aff33f1-a871-41df-a6f1-fd7146e23a9c-kube-api-access-bg8hj\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.506143 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.527587 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.549278 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.572409 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: 
I0220 16:32:10.578599 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.578655 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.578674 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.578702 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.578721 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:10Z","lastTransitionTime":"2026-02-20T16:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.594508 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907
ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.605044 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.606805 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.607062 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg8hj\" (UniqueName: 
\"kubernetes.io/projected/0aff33f1-a871-41df-a6f1-fd7146e23a9c-kube-api-access-bg8hj\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:10 crc kubenswrapper[4697]: E0220 16:32:10.607073 4697 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:10 crc kubenswrapper[4697]: E0220 16:32:10.607405 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs podName:0aff33f1-a871-41df-a6f1-fd7146e23a9c nodeName:}" failed. No retries permitted until 2026-02-20 16:32:11.107380623 +0000 UTC m=+38.887426041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs") pod "network-metrics-daemon-nskrw" (UID: "0aff33f1-a871-41df-a6f1-fd7146e23a9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.607946 4697 scope.go:117] "RemoveContainer" containerID="5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e" Feb 20 16:32:10 crc kubenswrapper[4697]: E0220 16:32:10.608276 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.614769 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.631602 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg8hj\" (UniqueName: \"kubernetes.io/projected/0aff33f1-a871-41df-a6f1-fd7146e23a9c-kube-api-access-bg8hj\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.638355 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.652004 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc 
kubenswrapper[4697]: I0220 16:32:10.671388 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.681859 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.681934 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.681953 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.681985 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.682004 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:10Z","lastTransitionTime":"2026-02-20T16:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.687974 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.708617 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.727664 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.739521 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.756532 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.784694 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:06Z\\\",\\\"message\\\":\\\"r 0 failed attempt(s)\\\\nI0220 16:32:06.075986 6135 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0220 16:32:06.075985 6135 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0220 16:32:06.076022 6135 
default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0220 16:32:06.075805 6135 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076029 6135 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0220 16:32:06.076062 6135 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076089 6135 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc in node crc\\\\nI0220 16:32:06.076103 6135 obj\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.786161 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.786227 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.786256 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.786287 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.786306 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:10Z","lastTransitionTime":"2026-02-20T16:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.802664 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:10Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.840250 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 19:22:13.546005015 +0000 UTC Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.876677 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:10 crc kubenswrapper[4697]: E0220 16:32:10.876844 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.888887 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.888939 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.888956 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.888976 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.888993 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:10Z","lastTransitionTime":"2026-02-20T16:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.991859 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.991913 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.991924 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.991942 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:10 crc kubenswrapper[4697]: I0220 16:32:10.991954 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:10Z","lastTransitionTime":"2026-02-20T16:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.095283 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.095340 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.095366 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.095389 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.095404 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:11Z","lastTransitionTime":"2026-02-20T16:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.113327 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:11 crc kubenswrapper[4697]: E0220 16:32:11.113564 4697 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:11 crc kubenswrapper[4697]: E0220 16:32:11.113652 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs podName:0aff33f1-a871-41df-a6f1-fd7146e23a9c nodeName:}" failed. No retries permitted until 2026-02-20 16:32:12.11362679 +0000 UTC m=+39.893672238 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs") pod "network-metrics-daemon-nskrw" (UID: "0aff33f1-a871-41df-a6f1-fd7146e23a9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.192585 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" event={"ID":"cb182755-c6aa-48db-8be3-2c3e23b4b41b","Type":"ContainerStarted","Data":"1272fc7dd9d6eea71069daf4578b3a34bc7fa50aa07f4587d79425cffd51a84e"} Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.192900 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" event={"ID":"cb182755-c6aa-48db-8be3-2c3e23b4b41b","Type":"ContainerStarted","Data":"8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301"} Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.198748 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.198816 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.198843 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.198886 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.198906 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:11Z","lastTransitionTime":"2026-02-20T16:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.217403 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.235151 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.250830 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.264911 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.279274 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.295000 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.303414 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.303636 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.303759 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.303865 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.303967 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:11Z","lastTransitionTime":"2026-02-20T16:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.319413 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:06Z\\\",\\\"message\\\":\\\"r 0 failed attempt(s)\\\\nI0220 16:32:06.075986 6135 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0220 16:32:06.075985 6135 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0220 16:32:06.076022 6135 
default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0220 16:32:06.075805 6135 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076029 6135 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0220 16:32:06.076062 6135 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076089 6135 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc in node crc\\\\nI0220 16:32:06.076103 6135 obj\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.334609 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa
50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.366022 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.382522 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.395996 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.407125 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.407161 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.407172 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.407188 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.407200 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:11Z","lastTransitionTime":"2026-02-20T16:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.412503 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.423651 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.434533 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.446639 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.457743 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.467822 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:11Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:11 crc 
kubenswrapper[4697]: I0220 16:32:11.510269 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.510309 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.510320 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.510336 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.510348 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:11Z","lastTransitionTime":"2026-02-20T16:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.613076 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.613118 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.613128 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.613144 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.613153 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:11Z","lastTransitionTime":"2026-02-20T16:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.715501 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.715575 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.715598 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.715638 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.715672 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:11Z","lastTransitionTime":"2026-02-20T16:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.819626 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.819681 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.819698 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.819722 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.819738 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:11Z","lastTransitionTime":"2026-02-20T16:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.841323 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 05:34:07.222371544 +0000 UTC Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.876817 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.876896 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:11 crc kubenswrapper[4697]: E0220 16:32:11.877393 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.876965 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:11 crc kubenswrapper[4697]: E0220 16:32:11.877575 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:11 crc kubenswrapper[4697]: E0220 16:32:11.877416 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.922275 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.922600 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.922636 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.922664 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:11 crc kubenswrapper[4697]: I0220 16:32:11.922685 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:11Z","lastTransitionTime":"2026-02-20T16:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.026385 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.026504 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.026527 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.026560 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.026580 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:12Z","lastTransitionTime":"2026-02-20T16:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.123398 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:12 crc kubenswrapper[4697]: E0220 16:32:12.123676 4697 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:12 crc kubenswrapper[4697]: E0220 16:32:12.123813 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs podName:0aff33f1-a871-41df-a6f1-fd7146e23a9c nodeName:}" failed. No retries permitted until 2026-02-20 16:32:14.123780575 +0000 UTC m=+41.903826023 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs") pod "network-metrics-daemon-nskrw" (UID: "0aff33f1-a871-41df-a6f1-fd7146e23a9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.131919 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.131975 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.131988 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.132010 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.132023 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:12Z","lastTransitionTime":"2026-02-20T16:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.234281 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.234398 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.234425 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.234487 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.234510 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:12Z","lastTransitionTime":"2026-02-20T16:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.337177 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.337250 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.337270 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.337298 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.337319 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:12Z","lastTransitionTime":"2026-02-20T16:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.439859 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.439932 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.439952 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.439983 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.440006 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:12Z","lastTransitionTime":"2026-02-20T16:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.542930 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.542992 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.543009 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.543035 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.543054 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:12Z","lastTransitionTime":"2026-02-20T16:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.646679 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.646768 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.646808 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.646849 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.646875 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:12Z","lastTransitionTime":"2026-02-20T16:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.751293 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.751734 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.751902 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.752055 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.752208 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:12Z","lastTransitionTime":"2026-02-20T16:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.842176 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 13:54:14.125006824 +0000 UTC Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.856242 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.856324 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.856346 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.856379 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.856406 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:12Z","lastTransitionTime":"2026-02-20T16:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.876854 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:12 crc kubenswrapper[4697]: E0220 16:32:12.877081 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.905785 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17a
b95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:12Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.927255 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:12Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.953213 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:12Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.960816 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.960871 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.960890 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.960917 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.960934 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:12Z","lastTransitionTime":"2026-02-20T16:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:12 crc kubenswrapper[4697]: I0220 16:32:12.987213 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:12Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc 
kubenswrapper[4697]: I0220 16:32:13.018050 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.033546 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.050907 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b6
7b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.064249 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.064351 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.064425 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.064524 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.064600 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:13Z","lastTransitionTime":"2026-02-20T16:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.076417 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:06Z\\\",\\\"message\\\":\\\"r 0 failed attempt(s)\\\\nI0220 16:32:06.075986 6135 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0220 16:32:06.075985 6135 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0220 16:32:06.076022 6135 
default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0220 16:32:06.075805 6135 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076029 6135 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0220 16:32:06.076062 6135 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076089 6135 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc in node crc\\\\nI0220 16:32:06.076103 6135 obj\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.091396 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa
50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.107902 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.123431 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.135970 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.149964 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.168165 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.168249 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.168279 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.168319 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.168348 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:13Z","lastTransitionTime":"2026-02-20T16:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.172094 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.191637 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.209456 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.232326 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:13Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.272014 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.272062 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.272077 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.272098 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.272114 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:13Z","lastTransitionTime":"2026-02-20T16:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.375501 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.375783 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.376064 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.376163 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.376241 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:13Z","lastTransitionTime":"2026-02-20T16:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.479730 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.479832 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.479853 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.479884 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.479905 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:13Z","lastTransitionTime":"2026-02-20T16:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.583342 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.583417 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.583467 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.583501 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.583526 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:13Z","lastTransitionTime":"2026-02-20T16:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.686860 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.686958 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.686984 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.687021 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.687045 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:13Z","lastTransitionTime":"2026-02-20T16:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.790916 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.790979 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.790999 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.791032 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.791053 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:13Z","lastTransitionTime":"2026-02-20T16:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.842850 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 06:27:24.158931049 +0000 UTC Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.876868 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.876927 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.876892 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:13 crc kubenswrapper[4697]: E0220 16:32:13.877114 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:13 crc kubenswrapper[4697]: E0220 16:32:13.877318 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:13 crc kubenswrapper[4697]: E0220 16:32:13.877607 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.895406 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.895501 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.895527 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.895559 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.895583 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:13Z","lastTransitionTime":"2026-02-20T16:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.999031 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.999107 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.999127 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.999158 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:13 crc kubenswrapper[4697]: I0220 16:32:13.999182 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:13Z","lastTransitionTime":"2026-02-20T16:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.101903 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.101960 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.101981 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.102005 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.102022 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:14Z","lastTransitionTime":"2026-02-20T16:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.149233 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:14 crc kubenswrapper[4697]: E0220 16:32:14.149500 4697 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:14 crc kubenswrapper[4697]: E0220 16:32:14.149584 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs podName:0aff33f1-a871-41df-a6f1-fd7146e23a9c nodeName:}" failed. No retries permitted until 2026-02-20 16:32:18.149561733 +0000 UTC m=+45.929607171 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs") pod "network-metrics-daemon-nskrw" (UID: "0aff33f1-a871-41df-a6f1-fd7146e23a9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.204846 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.204975 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.205001 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.205041 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.205068 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:14Z","lastTransitionTime":"2026-02-20T16:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.310261 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.310337 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.310357 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.310400 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.310421 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:14Z","lastTransitionTime":"2026-02-20T16:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.414246 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.414345 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.414372 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.414419 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.414482 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:14Z","lastTransitionTime":"2026-02-20T16:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.518701 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.518778 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.518791 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.518812 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.518827 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:14Z","lastTransitionTime":"2026-02-20T16:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.622827 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.622897 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.622916 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.622945 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.622968 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:14Z","lastTransitionTime":"2026-02-20T16:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.733879 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.733963 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.733984 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.734021 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.734043 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:14Z","lastTransitionTime":"2026-02-20T16:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.839511 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.839808 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.839933 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.840067 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.840184 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:14Z","lastTransitionTime":"2026-02-20T16:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.843672 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 11:10:31.81918469 +0000 UTC Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.876738 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:14 crc kubenswrapper[4697]: E0220 16:32:14.877013 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.943424 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.943528 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.943552 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.943587 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:14 crc kubenswrapper[4697]: I0220 16:32:14.943612 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:14Z","lastTransitionTime":"2026-02-20T16:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.047354 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.047421 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.047480 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.047512 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.047533 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:15Z","lastTransitionTime":"2026-02-20T16:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.150621 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.150675 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.150688 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.150705 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.150716 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:15Z","lastTransitionTime":"2026-02-20T16:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.254393 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.254482 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.254493 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.254512 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.254520 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:15Z","lastTransitionTime":"2026-02-20T16:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.356361 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.356411 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.356421 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.356460 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.356471 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:15Z","lastTransitionTime":"2026-02-20T16:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.460071 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.460142 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.460160 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.460189 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.460211 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:15Z","lastTransitionTime":"2026-02-20T16:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.564950 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.565043 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.565090 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.565120 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.565140 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:15Z","lastTransitionTime":"2026-02-20T16:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.670717 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.670808 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.670830 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.670866 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.670889 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:15Z","lastTransitionTime":"2026-02-20T16:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.775142 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.775191 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.775200 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.775218 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.775228 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:15Z","lastTransitionTime":"2026-02-20T16:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.844419 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 04:54:44.97932545 +0000 UTC Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.876730 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.876778 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.876754 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:15 crc kubenswrapper[4697]: E0220 16:32:15.876993 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:15 crc kubenswrapper[4697]: E0220 16:32:15.877184 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:15 crc kubenswrapper[4697]: E0220 16:32:15.877320 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.880601 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.880662 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.880685 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.880712 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.880733 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:15Z","lastTransitionTime":"2026-02-20T16:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.984253 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.984301 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.984313 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.984332 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:15 crc kubenswrapper[4697]: I0220 16:32:15.984342 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:15Z","lastTransitionTime":"2026-02-20T16:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.088411 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.088505 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.088524 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.088552 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.088569 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:16Z","lastTransitionTime":"2026-02-20T16:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.192624 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.192699 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.192718 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.192760 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.192784 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:16Z","lastTransitionTime":"2026-02-20T16:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.296247 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.296318 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.296338 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.296369 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.296389 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:16Z","lastTransitionTime":"2026-02-20T16:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.399510 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.399600 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.399621 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.399654 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.399679 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:16Z","lastTransitionTime":"2026-02-20T16:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.503790 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.503856 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.503877 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.503906 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.503924 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:16Z","lastTransitionTime":"2026-02-20T16:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.608557 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.608723 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.608743 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.608772 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.608789 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:16Z","lastTransitionTime":"2026-02-20T16:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.712771 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.712805 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.712821 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.712841 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.712851 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:16Z","lastTransitionTime":"2026-02-20T16:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.815562 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.815646 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.815667 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.815696 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.815715 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:16Z","lastTransitionTime":"2026-02-20T16:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.845213 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 19:05:58.647010269 +0000 UTC Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.876935 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:16 crc kubenswrapper[4697]: E0220 16:32:16.877113 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.918535 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.918606 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.918630 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.918662 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:16 crc kubenswrapper[4697]: I0220 16:32:16.918686 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:16Z","lastTransitionTime":"2026-02-20T16:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.021558 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.021625 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.021643 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.021672 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.021690 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:17Z","lastTransitionTime":"2026-02-20T16:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.127809 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.127907 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.127941 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.127964 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.127974 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:17Z","lastTransitionTime":"2026-02-20T16:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.230855 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.230938 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.230959 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.230991 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.231011 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:17Z","lastTransitionTime":"2026-02-20T16:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.334499 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.334550 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.334566 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.334580 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.334589 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:17Z","lastTransitionTime":"2026-02-20T16:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.437563 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.437595 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.437603 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.437617 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.437626 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:17Z","lastTransitionTime":"2026-02-20T16:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.541060 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.541140 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.541160 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.541191 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.541210 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:17Z","lastTransitionTime":"2026-02-20T16:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.644920 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.644964 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.644976 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.644995 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.645007 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:17Z","lastTransitionTime":"2026-02-20T16:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.748005 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.748079 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.748095 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.748121 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.748141 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:17Z","lastTransitionTime":"2026-02-20T16:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.845631 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 18:11:02.110321719 +0000 UTC Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.851408 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.851540 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.851573 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.851593 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.851604 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:17Z","lastTransitionTime":"2026-02-20T16:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.877545 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.877388 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.877733 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:17 crc kubenswrapper[4697]: E0220 16:32:17.877786 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:17 crc kubenswrapper[4697]: E0220 16:32:17.877881 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:17 crc kubenswrapper[4697]: E0220 16:32:17.877957 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.955233 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.955294 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.955305 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.955338 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:17 crc kubenswrapper[4697]: I0220 16:32:17.955351 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:17Z","lastTransitionTime":"2026-02-20T16:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.059340 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.059401 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.059544 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.059573 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.059590 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.163025 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.163102 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.163133 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.163186 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.163220 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.203535 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:18 crc kubenswrapper[4697]: E0220 16:32:18.203699 4697 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:18 crc kubenswrapper[4697]: E0220 16:32:18.203770 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs podName:0aff33f1-a871-41df-a6f1-fd7146e23a9c nodeName:}" failed. No retries permitted until 2026-02-20 16:32:26.203751521 +0000 UTC m=+53.983796939 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs") pod "network-metrics-daemon-nskrw" (UID: "0aff33f1-a871-41df-a6f1-fd7146e23a9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.266583 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.266658 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.266677 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.266703 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.266723 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.370187 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.370312 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.370340 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.370366 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.370385 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.473674 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.473729 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.473741 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.473762 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.473778 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.576549 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.576595 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.576607 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.576625 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.576637 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.680964 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.681063 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.681084 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.681121 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.681145 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.785620 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.785689 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.785706 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.785733 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.785750 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.846407 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 17:15:05.297588229 +0000 UTC Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.878007 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:18 crc kubenswrapper[4697]: E0220 16:32:18.878252 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.889138 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.889202 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.889216 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.889253 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.889266 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.925170 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.925274 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.925477 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.925503 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.925517 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: E0220 16:32:18.946425 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:18Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.951591 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.951648 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.951666 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.951690 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.951706 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: E0220 16:32:18.968878 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:18Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.974054 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.974151 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.974177 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.974239 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.974258 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:18 crc kubenswrapper[4697]: E0220 16:32:18.991965 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:18Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.998390 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.998504 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.998530 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.998563 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:18 crc kubenswrapper[4697]: I0220 16:32:18.998587 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:18Z","lastTransitionTime":"2026-02-20T16:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:19 crc kubenswrapper[4697]: E0220 16:32:19.017704 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:19Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.024515 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.024783 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.024868 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.024962 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.025040 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:19Z","lastTransitionTime":"2026-02-20T16:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:19 crc kubenswrapper[4697]: E0220 16:32:19.041516 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:19Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:19 crc kubenswrapper[4697]: E0220 16:32:19.041686 4697 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.044058 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.044119 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.044134 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.044155 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.044170 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:19Z","lastTransitionTime":"2026-02-20T16:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.146726 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.146805 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.146827 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.146854 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.146869 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:19Z","lastTransitionTime":"2026-02-20T16:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.250056 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.250109 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.250121 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.250144 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.250156 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:19Z","lastTransitionTime":"2026-02-20T16:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.352882 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.352939 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.352954 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.352976 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.352990 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:19Z","lastTransitionTime":"2026-02-20T16:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.455905 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.455966 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.455978 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.455999 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.456013 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:19Z","lastTransitionTime":"2026-02-20T16:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.559479 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.559538 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.559555 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.559582 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.559601 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:19Z","lastTransitionTime":"2026-02-20T16:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.662411 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.662482 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.662495 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.662513 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.662526 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:19Z","lastTransitionTime":"2026-02-20T16:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.766685 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.766761 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.766780 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.766809 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.766827 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:19Z","lastTransitionTime":"2026-02-20T16:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.846743 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 05:22:13.430669534 +0000 UTC Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.870097 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.870136 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.870148 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.870163 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.870173 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:19Z","lastTransitionTime":"2026-02-20T16:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.877046 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:19 crc kubenswrapper[4697]: E0220 16:32:19.877211 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.877493 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.877676 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:19 crc kubenswrapper[4697]: E0220 16:32:19.877781 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:19 crc kubenswrapper[4697]: E0220 16:32:19.877976 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.973419 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.973536 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.973559 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.973591 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:19 crc kubenswrapper[4697]: I0220 16:32:19.973613 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:19Z","lastTransitionTime":"2026-02-20T16:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.076915 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.076980 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.076998 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.077032 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.077049 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:20Z","lastTransitionTime":"2026-02-20T16:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.179934 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.180233 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.180308 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.180397 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.180515 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:20Z","lastTransitionTime":"2026-02-20T16:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.282828 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.283090 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.283170 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.283365 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.283473 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:20Z","lastTransitionTime":"2026-02-20T16:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.387246 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.387286 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.387297 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.387315 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.387333 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:20Z","lastTransitionTime":"2026-02-20T16:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.490429 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.490552 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.490577 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.490613 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.490637 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:20Z","lastTransitionTime":"2026-02-20T16:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.593834 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.593883 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.593897 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.593916 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.593926 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:20Z","lastTransitionTime":"2026-02-20T16:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.697545 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.697586 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.697597 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.697615 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.697627 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:20Z","lastTransitionTime":"2026-02-20T16:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.800294 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.800330 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.800340 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.800356 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.800368 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:20Z","lastTransitionTime":"2026-02-20T16:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.847322 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 01:13:54.486590958 +0000 UTC Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.876812 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:20 crc kubenswrapper[4697]: E0220 16:32:20.877132 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.902238 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.902319 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.902332 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.902349 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:20 crc kubenswrapper[4697]: I0220 16:32:20.902363 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:20Z","lastTransitionTime":"2026-02-20T16:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.005446 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.005505 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.005519 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.005541 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.005554 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:21Z","lastTransitionTime":"2026-02-20T16:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.108487 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.108533 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.108545 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.108562 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.108575 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:21Z","lastTransitionTime":"2026-02-20T16:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.211650 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.211735 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.211757 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.211792 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.211814 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:21Z","lastTransitionTime":"2026-02-20T16:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.316512 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.316557 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.316571 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.316594 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.316610 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:21Z","lastTransitionTime":"2026-02-20T16:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.420504 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.420567 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.420581 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.420605 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.420620 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:21Z","lastTransitionTime":"2026-02-20T16:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.523209 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.523264 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.523282 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.523312 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.523331 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:21Z","lastTransitionTime":"2026-02-20T16:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.627277 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.627366 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.627503 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.627537 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.627572 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:21Z","lastTransitionTime":"2026-02-20T16:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.731272 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.731335 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.731351 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.731390 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.731406 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:21Z","lastTransitionTime":"2026-02-20T16:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.834930 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.835004 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.835027 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.835065 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.835091 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:21Z","lastTransitionTime":"2026-02-20T16:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.847479 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 14:33:52.215823593 +0000 UTC Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.876883 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.876991 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:21 crc kubenswrapper[4697]: E0220 16:32:21.877083 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.877096 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:21 crc kubenswrapper[4697]: E0220 16:32:21.877186 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:21 crc kubenswrapper[4697]: E0220 16:32:21.877226 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.938177 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.938239 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.938260 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.938295 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:21 crc kubenswrapper[4697]: I0220 16:32:21.938317 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:21Z","lastTransitionTime":"2026-02-20T16:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.042864 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.042928 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.042950 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.042982 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.043007 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:22Z","lastTransitionTime":"2026-02-20T16:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.146843 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.146917 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.146937 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.146970 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.146993 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:22Z","lastTransitionTime":"2026-02-20T16:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.250054 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.250101 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.250118 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.250142 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.250163 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:22Z","lastTransitionTime":"2026-02-20T16:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.353081 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.353188 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.353200 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.353220 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.353230 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:22Z","lastTransitionTime":"2026-02-20T16:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.457835 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.458283 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.458429 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.458814 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.458963 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:22Z","lastTransitionTime":"2026-02-20T16:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.562971 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.563043 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.563062 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.563090 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.563111 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:22Z","lastTransitionTime":"2026-02-20T16:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.666359 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.667030 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.667215 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.667687 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.667854 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:22Z","lastTransitionTime":"2026-02-20T16:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.771771 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.772090 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.772243 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.773007 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.773168 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:22Z","lastTransitionTime":"2026-02-20T16:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.848196 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 03:25:37.612899049 +0000 UTC Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.878710 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:22 crc kubenswrapper[4697]: E0220 16:32:22.878918 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.878467 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.879307 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.879335 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.879371 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.879396 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:22Z","lastTransitionTime":"2026-02-20T16:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.902701 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:22Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.924565 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:22Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.947896 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:22Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.972599 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:22Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.987201 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.987327 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.987409 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.987506 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.987585 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:22Z","lastTransitionTime":"2026-02-20T16:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:22 crc kubenswrapper[4697]: I0220 16:32:22.992265 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:22Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.019150 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.088128 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:06Z\\\",\\\"message\\\":\\\"r 0 failed attempt(s)\\\\nI0220 16:32:06.075986 6135 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0220 16:32:06.075985 6135 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0220 16:32:06.076022 6135 
default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0220 16:32:06.075805 6135 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076029 6135 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0220 16:32:06.076062 6135 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076089 6135 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc in node crc\\\\nI0220 16:32:06.076103 6135 obj\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.091796 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.092212 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.092463 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.092651 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.092790 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:23Z","lastTransitionTime":"2026-02-20T16:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.110565 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.137196 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.158796 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.185985 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.195723 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.195807 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.195834 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.195868 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.195889 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:23Z","lastTransitionTime":"2026-02-20T16:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.208851 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.250417 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.275068 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.293019 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.300803 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.300876 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.300897 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.300929 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.300950 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:23Z","lastTransitionTime":"2026-02-20T16:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.315178 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc 
kubenswrapper[4697]: I0220 16:32:23.338999 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:23Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.404949 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.405037 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.405063 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.405098 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.405124 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:23Z","lastTransitionTime":"2026-02-20T16:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.509566 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.509683 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.509706 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.509734 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.509753 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:23Z","lastTransitionTime":"2026-02-20T16:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.613101 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.613175 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.613195 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.613223 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.613241 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:23Z","lastTransitionTime":"2026-02-20T16:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.668427 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.668680 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 16:32:55.668646297 +0000 UTC m=+83.448691705 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.668730 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.668798 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.668931 4697 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.668955 4697 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.669001 4697 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:55.668990426 +0000 UTC m=+83.449035834 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.669021 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:55.669012736 +0000 UTC m=+83.449058144 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.717725 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.717784 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.717806 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.717837 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.717859 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:23Z","lastTransitionTime":"2026-02-20T16:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.769964 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.770086 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.770340 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.770387 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.770409 4697 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.770403 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:32:23 crc 
kubenswrapper[4697]: E0220 16:32:23.770502 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.770527 4697 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.770538 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:55.770510658 +0000 UTC m=+83.550556106 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.770615 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 16:32:55.770586169 +0000 UTC m=+83.550631677 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.821521 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.821588 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.821607 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.821637 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.821657 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:23Z","lastTransitionTime":"2026-02-20T16:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.849399 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 06:59:04.712080663 +0000 UTC Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.876996 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.877047 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.877193 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.877271 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.877466 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:23 crc kubenswrapper[4697]: E0220 16:32:23.878069 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.878633 4697 scope.go:117] "RemoveContainer" containerID="5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.926170 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.926252 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.926281 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.926316 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:23 crc kubenswrapper[4697]: I0220 16:32:23.926370 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:23Z","lastTransitionTime":"2026-02-20T16:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.029500 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.029577 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.029593 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.029623 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.029641 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:24Z","lastTransitionTime":"2026-02-20T16:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.133398 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.133489 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.133507 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.133531 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.133550 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:24Z","lastTransitionTime":"2026-02-20T16:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.237091 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.237155 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.237170 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.237195 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.237210 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:24Z","lastTransitionTime":"2026-02-20T16:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.247934 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/1.log" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.251805 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a"} Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.252416 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.275493 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.297530 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.316303 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.340483 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.340547 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.340563 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.340594 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.340612 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:24Z","lastTransitionTime":"2026-02-20T16:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.346003 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z 
is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.376404 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:06Z\\\",\\\"message\\\":\\\"r 0 failed attempt(s)\\\\nI0220 16:32:06.075986 6135 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0220 16:32:06.075985 6135 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0220 16:32:06.076022 6135 
default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0220 16:32:06.075805 6135 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076029 6135 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0220 16:32:06.076062 6135 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076089 6135 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc in node crc\\\\nI0220 16:32:06.076103 6135 
obj\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.397373 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa
50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.414505 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.439768 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.444176 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.444244 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.444266 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.444293 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.444311 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:24Z","lastTransitionTime":"2026-02-20T16:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.465931 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.485148 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96d
d207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.512698 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.531160 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.544889 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.547110 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.547147 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.547161 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.547189 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.547205 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:24Z","lastTransitionTime":"2026-02-20T16:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.559154 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc 
kubenswrapper[4697]: I0220 16:32:24.578442 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.593721 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.606207 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:24Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.650836 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.650909 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.650924 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.650954 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.650970 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:24Z","lastTransitionTime":"2026-02-20T16:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.753907 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.753964 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.753975 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.753997 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.754010 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:24Z","lastTransitionTime":"2026-02-20T16:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.849945 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 16:49:35.545659762 +0000 UTC Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.857393 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.857426 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.857449 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.857469 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.857479 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:24Z","lastTransitionTime":"2026-02-20T16:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.877401 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:24 crc kubenswrapper[4697]: E0220 16:32:24.877756 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.961023 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.961086 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.961103 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.961132 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:24 crc kubenswrapper[4697]: I0220 16:32:24.961150 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:24Z","lastTransitionTime":"2026-02-20T16:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.064689 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.064760 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.064777 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.064803 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.064823 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:25Z","lastTransitionTime":"2026-02-20T16:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.169139 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.169236 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.169257 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.169290 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.169312 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:25Z","lastTransitionTime":"2026-02-20T16:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.258266 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/2.log" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.259310 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/1.log" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.263979 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a" exitCode=1 Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.264077 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a"} Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.264176 4697 scope.go:117] "RemoveContainer" containerID="5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.265427 4697 scope.go:117] "RemoveContainer" containerID="fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a" Feb 20 16:32:25 crc kubenswrapper[4697]: E0220 16:32:25.265754 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.272988 4697 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.273056 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.273082 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.273116 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.273146 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:25Z","lastTransitionTime":"2026-02-20T16:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.290933 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.316174 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.341869 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" 
Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.356982 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.376898 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.376954 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.376973 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.377000 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.377022 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:25Z","lastTransitionTime":"2026-02-20T16:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.380538 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z 
is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.416483 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5080690ce6411970189c09908111338d6cfa9e2c5c191546755f2d4da3dc356e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:06Z\\\",\\\"message\\\":\\\"r 0 failed attempt(s)\\\\nI0220 16:32:06.075986 6135 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0220 16:32:06.075985 6135 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0220 16:32:06.076022 6135 
default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0220 16:32:06.075805 6135 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076029 6135 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0220 16:32:06.076062 6135 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc\\\\nI0220 16:32:06.076089 6135 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-9zpdc in node crc\\\\nI0220 16:32:06.076103 6135 obj\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:24Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0220 16:32:24.946608 6352 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 16:32:24.946783 6352 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.946805 6352 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 16:32:24.946783 6352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:24.947100 6352 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0220 16:32:24.947247 6352 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:24.947308 6352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0220 16:32:24.947384 6352 factory.go:656] Stopping watch factory\\\\nI0220 16:32:24.947457 6352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:24.947465 6352 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947551 6352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:24.947039 6352 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947560 6352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0220 16:32:24.947591 6352 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\
\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.439320 4697 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.464000 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.480781 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.480841 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.480860 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.480886 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.480905 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:25Z","lastTransitionTime":"2026-02-20T16:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.487830 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.515822 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.536896 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: 
I0220 16:32:25.569475 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.583666 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.583707 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.583719 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.583740 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.583754 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:25Z","lastTransitionTime":"2026-02-20T16:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.596643 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.614560 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.633529 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc 
kubenswrapper[4697]: I0220 16:32:25.655091 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.676152 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:25Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.686814 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 
16:32:25.686861 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.686903 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.686932 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.686945 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:25Z","lastTransitionTime":"2026-02-20T16:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.790972 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.791027 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.791043 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.791070 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.791089 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:25Z","lastTransitionTime":"2026-02-20T16:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.851221 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 18:46:22.558888447 +0000 UTC Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.877215 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:25 crc kubenswrapper[4697]: E0220 16:32:25.877404 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.877761 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.877847 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:25 crc kubenswrapper[4697]: E0220 16:32:25.878095 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:25 crc kubenswrapper[4697]: E0220 16:32:25.878352 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.894235 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.894286 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.894314 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.894348 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:25 crc kubenswrapper[4697]: I0220 16:32:25.894377 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:25Z","lastTransitionTime":"2026-02-20T16:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.001822 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.001897 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.001917 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.001943 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.001963 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:26Z","lastTransitionTime":"2026-02-20T16:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.105873 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.105942 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.105960 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.105988 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.106006 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:26Z","lastTransitionTime":"2026-02-20T16:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.209365 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.209476 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.209498 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.209532 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.209555 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:26Z","lastTransitionTime":"2026-02-20T16:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.271841 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/2.log" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.279536 4697 scope.go:117] "RemoveContainer" containerID="fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a" Feb 20 16:32:26 crc kubenswrapper[4697]: E0220 16:32:26.279860 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.304535 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:26 crc kubenswrapper[4697]: E0220 16:32:26.304831 4697 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:26 crc kubenswrapper[4697]: E0220 16:32:26.304952 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs podName:0aff33f1-a871-41df-a6f1-fd7146e23a9c nodeName:}" failed. No retries permitted until 2026-02-20 16:32:42.304919659 +0000 UTC m=+70.084965097 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs") pod "network-metrics-daemon-nskrw" (UID: "0aff33f1-a871-41df-a6f1-fd7146e23a9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.308190 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714
c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net
.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.312793 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.312835 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.312855 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.312883 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.312903 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:26Z","lastTransitionTime":"2026-02-20T16:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.328352 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.365311 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.393502 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.417701 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.417777 4697 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.417796 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.417835 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.417859 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:26Z","lastTransitionTime":"2026-02-20T16:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.418472 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.438702 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc 
kubenswrapper[4697]: I0220 16:32:26.460608 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.483836 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.502668 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.521170 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.521237 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.521256 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.521289 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.521312 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:26Z","lastTransitionTime":"2026-02-20T16:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.524369 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.543202 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.561748 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.578956 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.610112 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:24Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0220 16:32:24.946608 6352 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 16:32:24.946783 6352 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.946805 6352 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 
16:32:24.946783 6352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:24.947100 6352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0220 16:32:24.947247 6352 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:24.947308 6352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0220 16:32:24.947384 6352 factory.go:656] Stopping watch factory\\\\nI0220 16:32:24.947457 6352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:24.947465 6352 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947551 6352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:24.947039 6352 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947560 6352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0220 16:32:24.947591 6352 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.626413 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.626824 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.626990 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.627163 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.627313 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:26Z","lastTransitionTime":"2026-02-20T16:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.630621 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.654232 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.675191 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:26Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.730683 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.730757 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.730775 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.730803 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.730825 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:26Z","lastTransitionTime":"2026-02-20T16:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.837245 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.837349 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.837376 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.837413 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.837470 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:26Z","lastTransitionTime":"2026-02-20T16:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.851720 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 00:31:00.094069939 +0000 UTC Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.876351 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:26 crc kubenswrapper[4697]: E0220 16:32:26.876552 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.941199 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.941281 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.941301 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.941341 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:26 crc kubenswrapper[4697]: I0220 16:32:26.941372 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:26Z","lastTransitionTime":"2026-02-20T16:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.044656 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.044743 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.044768 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.044802 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.044825 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:27Z","lastTransitionTime":"2026-02-20T16:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.148604 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.148660 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.148670 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.148694 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.148710 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:27Z","lastTransitionTime":"2026-02-20T16:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.252504 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.252573 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.252593 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.252622 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.252645 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:27Z","lastTransitionTime":"2026-02-20T16:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.355676 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.355752 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.355780 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.355820 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.355845 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:27Z","lastTransitionTime":"2026-02-20T16:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.459521 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.459590 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.459608 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.459637 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.459657 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:27Z","lastTransitionTime":"2026-02-20T16:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.564450 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.564503 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.564516 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.564536 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.564549 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:27Z","lastTransitionTime":"2026-02-20T16:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.667262 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.667343 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.667362 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.667395 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.667418 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:27Z","lastTransitionTime":"2026-02-20T16:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.771334 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.771411 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.771461 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.771495 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.771518 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:27Z","lastTransitionTime":"2026-02-20T16:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.852819 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 18:52:46.407740184 +0000 UTC Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.875488 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.875563 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.875583 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.875619 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.875640 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:27Z","lastTransitionTime":"2026-02-20T16:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.876206 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.876259 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:27 crc kubenswrapper[4697]: E0220 16:32:27.876466 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:27 crc kubenswrapper[4697]: E0220 16:32:27.876539 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.876628 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:27 crc kubenswrapper[4697]: E0220 16:32:27.876764 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.972147 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.981955 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.982011 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.982024 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.982048 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.982063 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:27Z","lastTransitionTime":"2026-02-20T16:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.989638 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 20 16:32:27 crc kubenswrapper[4697]: I0220 16:32:27.997850 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:27Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.020407 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.045962 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc 
kubenswrapper[4697]: I0220 16:32:28.067529 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.085534 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.085627 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.085648 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.085689 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.085710 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:28Z","lastTransitionTime":"2026-02-20T16:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.091533 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.112574 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.136156 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.161836 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.182359 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.189234 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.189325 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.189347 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.189381 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.189403 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:28Z","lastTransitionTime":"2026-02-20T16:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.204649 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z 
is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.240733 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:24Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0220 16:32:24.946608 6352 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 16:32:24.946783 6352 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.946805 6352 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 
16:32:24.946783 6352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:24.947100 6352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0220 16:32:24.947247 6352 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:24.947308 6352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0220 16:32:24.947384 6352 factory.go:656] Stopping watch factory\\\\nI0220 16:32:24.947457 6352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:24.947465 6352 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947551 6352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:24.947039 6352 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947560 6352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0220 16:32:24.947591 6352 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.259816 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa
50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.281662 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.292265 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.292343 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.292365 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.292403 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.292429 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:28Z","lastTransitionTime":"2026-02-20T16:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.304027 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.325235 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.341749 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: 
I0220 16:32:28.374463 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:28Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.396545 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.396611 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.396630 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.396661 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.396680 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:28Z","lastTransitionTime":"2026-02-20T16:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.499850 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.499917 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.499937 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.499968 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.499990 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:28Z","lastTransitionTime":"2026-02-20T16:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.602797 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.603539 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.603727 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.603887 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.604031 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:28Z","lastTransitionTime":"2026-02-20T16:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.707334 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.707400 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.707420 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.707488 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.707509 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:28Z","lastTransitionTime":"2026-02-20T16:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.811348 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.811406 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.811420 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.811467 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.811482 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:28Z","lastTransitionTime":"2026-02-20T16:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.853530 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 06:22:15.890279175 +0000 UTC Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.877107 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:28 crc kubenswrapper[4697]: E0220 16:32:28.877542 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.914651 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.914740 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.914762 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.914863 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:28 crc kubenswrapper[4697]: I0220 16:32:28.914897 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:28Z","lastTransitionTime":"2026-02-20T16:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.018656 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.018741 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.018764 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.018798 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.018849 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.092779 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.092903 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.092935 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.092981 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.093027 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: E0220 16:32:29.118329 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:29Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.125634 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.125697 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.125723 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.125756 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.125782 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: E0220 16:32:29.149926 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:29Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.156272 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.156602 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.157753 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.158004 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.158216 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: E0220 16:32:29.180773 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:29Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.186072 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.186285 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.186478 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.186652 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.186807 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: E0220 16:32:29.206231 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:29Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.212251 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.212511 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.212675 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.212820 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.212965 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: E0220 16:32:29.236569 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:29Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:29 crc kubenswrapper[4697]: E0220 16:32:29.237318 4697 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.239705 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.239763 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.239806 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.239831 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.239851 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.342788 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.342863 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.342883 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.342916 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.342939 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.446212 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.446268 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.446285 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.446308 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.446327 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.550292 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.550362 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.550380 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.550413 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.550472 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.654167 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.654231 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.654250 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.654279 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.654300 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.756791 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.756866 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.756886 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.756915 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.756941 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.854545 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 20:37:58.964112218 +0000 UTC Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.860076 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.860135 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.860152 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.860174 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.860186 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.876402 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.876538 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.876579 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:29 crc kubenswrapper[4697]: E0220 16:32:29.876805 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:29 crc kubenswrapper[4697]: E0220 16:32:29.876973 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:29 crc kubenswrapper[4697]: E0220 16:32:29.877174 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.963192 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.963243 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.963259 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.963281 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:29 crc kubenswrapper[4697]: I0220 16:32:29.963295 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:29Z","lastTransitionTime":"2026-02-20T16:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.067373 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.067477 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.067498 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.067532 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.067552 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:30Z","lastTransitionTime":"2026-02-20T16:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.178113 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.178176 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.178194 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.178224 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.178244 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:30Z","lastTransitionTime":"2026-02-20T16:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.282191 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.282270 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.282292 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.282321 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.282344 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:30Z","lastTransitionTime":"2026-02-20T16:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.391773 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.391852 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.391928 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.392020 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.392055 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:30Z","lastTransitionTime":"2026-02-20T16:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.496192 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.496259 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.496278 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.496311 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.496332 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:30Z","lastTransitionTime":"2026-02-20T16:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.600118 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.600206 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.600232 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.600266 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.600289 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:30Z","lastTransitionTime":"2026-02-20T16:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.703871 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.703968 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.703992 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.704025 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.704048 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:30Z","lastTransitionTime":"2026-02-20T16:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.807775 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.807855 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.807875 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.807910 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.807936 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:30Z","lastTransitionTime":"2026-02-20T16:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.855705 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 06:59:32.571226201 +0000 UTC Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.876613 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:30 crc kubenswrapper[4697]: E0220 16:32:30.876889 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.911469 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.911526 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.911546 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.911576 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:30 crc kubenswrapper[4697]: I0220 16:32:30.911600 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:30Z","lastTransitionTime":"2026-02-20T16:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.015509 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.015588 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.015610 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.015642 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.015665 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:31Z","lastTransitionTime":"2026-02-20T16:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.119359 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.119424 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.119484 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.119516 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.119545 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:31Z","lastTransitionTime":"2026-02-20T16:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.223076 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.223134 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.223153 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.223184 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.223205 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:31Z","lastTransitionTime":"2026-02-20T16:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.326880 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.326960 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.326988 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.327026 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.327051 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:31Z","lastTransitionTime":"2026-02-20T16:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.430787 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.430847 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.430869 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.430903 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.430926 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:31Z","lastTransitionTime":"2026-02-20T16:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.534503 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.534557 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.534579 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.534613 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.534636 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:31Z","lastTransitionTime":"2026-02-20T16:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.637849 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.637913 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.637926 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.637952 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.637969 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:31Z","lastTransitionTime":"2026-02-20T16:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.741551 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.741612 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.741625 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.741654 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.741671 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:31Z","lastTransitionTime":"2026-02-20T16:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.845554 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.845623 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.845644 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.845677 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.845700 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:31Z","lastTransitionTime":"2026-02-20T16:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.856146 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 11:20:41.683136494 +0000 UTC Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.876679 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.876765 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.876770 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:31 crc kubenswrapper[4697]: E0220 16:32:31.876936 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:31 crc kubenswrapper[4697]: E0220 16:32:31.877076 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:31 crc kubenswrapper[4697]: E0220 16:32:31.877234 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.949518 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.949705 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.949766 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.949799 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:31 crc kubenswrapper[4697]: I0220 16:32:31.949825 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:31Z","lastTransitionTime":"2026-02-20T16:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.053045 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.053115 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.053136 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.053170 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.053192 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:32Z","lastTransitionTime":"2026-02-20T16:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.156547 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.156636 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.156665 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.156705 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.156733 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:32Z","lastTransitionTime":"2026-02-20T16:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.259964 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.260035 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.260065 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.260106 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.260129 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:32Z","lastTransitionTime":"2026-02-20T16:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.363361 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.363428 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.363477 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.363507 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.363530 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:32Z","lastTransitionTime":"2026-02-20T16:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.466359 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.466418 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.466479 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.466513 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.466539 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:32Z","lastTransitionTime":"2026-02-20T16:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.570357 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.570473 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.570493 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.570524 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.570545 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:32Z","lastTransitionTime":"2026-02-20T16:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.673598 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.673673 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.673692 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.673722 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.673742 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:32Z","lastTransitionTime":"2026-02-20T16:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.777791 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.777897 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.777926 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.777966 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.777991 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:32Z","lastTransitionTime":"2026-02-20T16:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.857056 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 01:50:55.438101698 +0000 UTC Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.876857 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:32 crc kubenswrapper[4697]: E0220 16:32:32.877102 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.886742 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.886795 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.886814 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.886844 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.886865 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:32Z","lastTransitionTime":"2026-02-20T16:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.921740 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:32Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.944516 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:32Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.964418 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:32Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.991901 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.991963 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.991982 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:32 crc 
kubenswrapper[4697]: I0220 16:32:32.992017 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.992039 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:32Z","lastTransitionTime":"2026-02-20T16:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:32 crc kubenswrapper[4697]: I0220 16:32:32.996010 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b
e41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:32Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.017775 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96d
d207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.040666 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.063078 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.081611 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.096952 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.097029 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.097056 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.097090 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.097116 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:33Z","lastTransitionTime":"2026-02-20T16:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.101521 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc 
kubenswrapper[4697]: I0220 16:32:33.126664 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.146599 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.165388 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.187877 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d76d5561-c9fc-4377-b764-5a6856eada68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b231279d61bb858c2d662fd41b388d450db0ed9f92b55f968a334a2ce2b50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1adcc43d1a375bc06218f7d3c94c564132a5f3dd5cde0c7ee1f86883b8100552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b78299504f4a4555d5fa6bd331589ed16effa4428034951de3d8f83ce652780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.203998 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.204094 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.204117 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.204178 4697 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.204201 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:33Z","lastTransitionTime":"2026-02-20T16:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.207778 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.231680 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.250277 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.272056 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.306551 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:24Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0220 16:32:24.946608 6352 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 16:32:24.946783 6352 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.946805 6352 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 
16:32:24.946783 6352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:24.947100 6352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0220 16:32:24.947247 6352 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:24.947308 6352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0220 16:32:24.947384 6352 factory.go:656] Stopping watch factory\\\\nI0220 16:32:24.947457 6352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:24.947465 6352 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947551 6352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:24.947039 6352 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947560 6352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0220 16:32:24.947591 6352 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:33Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.308450 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.308516 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.308536 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.308565 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.308581 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:33Z","lastTransitionTime":"2026-02-20T16:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.417636 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.417726 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.417757 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.417793 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.417813 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:33Z","lastTransitionTime":"2026-02-20T16:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.522308 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.522423 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.522506 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.522544 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.522570 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:33Z","lastTransitionTime":"2026-02-20T16:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.626153 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.626247 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.626269 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.626301 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.626337 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:33Z","lastTransitionTime":"2026-02-20T16:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.729877 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.729960 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.729981 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.730006 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.730024 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:33Z","lastTransitionTime":"2026-02-20T16:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.833894 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.833998 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.834026 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.834066 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.834097 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:33Z","lastTransitionTime":"2026-02-20T16:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.857264 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 06:37:51.268765732 +0000 UTC
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.877151 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.877157 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 16:32:33 crc kubenswrapper[4697]: E0220 16:32:33.877397 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c"
Feb 20 16:32:33 crc kubenswrapper[4697]: E0220 16:32:33.877550 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.877165 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 16:32:33 crc kubenswrapper[4697]: E0220 16:32:33.877708 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.937756 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.938033 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.938209 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.938371 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:33 crc kubenswrapper[4697]: I0220 16:32:33.938594 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:33Z","lastTransitionTime":"2026-02-20T16:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.043211 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.043611 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.043758 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.043906 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.044057 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:34Z","lastTransitionTime":"2026-02-20T16:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.146844 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.146941 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.146963 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.146999 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.147020 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:34Z","lastTransitionTime":"2026-02-20T16:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.251626 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.251693 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.251719 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.251756 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.251782 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:34Z","lastTransitionTime":"2026-02-20T16:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.354802 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.354879 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.354899 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.354928 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.354948 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:34Z","lastTransitionTime":"2026-02-20T16:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.458136 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.458199 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.458218 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.458250 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.458273 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:34Z","lastTransitionTime":"2026-02-20T16:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.562472 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.562538 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.562561 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.562597 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.562623 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:34Z","lastTransitionTime":"2026-02-20T16:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.665940 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.666003 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.666022 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.666052 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.666073 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:34Z","lastTransitionTime":"2026-02-20T16:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.770251 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.770306 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.770324 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.770355 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.770375 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:34Z","lastTransitionTime":"2026-02-20T16:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.857709 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 00:45:16.246774401 +0000 UTC
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.873795 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.873894 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.873914 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.873957 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.874031 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:34Z","lastTransitionTime":"2026-02-20T16:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.876278 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 16:32:34 crc kubenswrapper[4697]: E0220 16:32:34.876555 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.978000 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.978097 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.978120 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.978149 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:34 crc kubenswrapper[4697]: I0220 16:32:34.978172 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:34Z","lastTransitionTime":"2026-02-20T16:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.082646 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.082734 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.082761 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.082798 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.082825 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:35Z","lastTransitionTime":"2026-02-20T16:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.187927 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.188000 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.188022 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.188059 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.188086 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:35Z","lastTransitionTime":"2026-02-20T16:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.292324 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.292415 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.292486 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.292526 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.292556 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:35Z","lastTransitionTime":"2026-02-20T16:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.396621 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.396679 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.396697 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.396728 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.396748 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:35Z","lastTransitionTime":"2026-02-20T16:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.500255 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.500341 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.500366 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.500403 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.500427 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:35Z","lastTransitionTime":"2026-02-20T16:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.604722 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.604789 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.604807 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.604838 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.604865 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:35Z","lastTransitionTime":"2026-02-20T16:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.708351 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.708429 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.708489 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.708521 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.708544 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:35Z","lastTransitionTime":"2026-02-20T16:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.811942 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.811992 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.812009 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.812040 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.812059 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:35Z","lastTransitionTime":"2026-02-20T16:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.858748 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 22:58:33.868971296 +0000 UTC
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.876029 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw"
Feb 20 16:32:35 crc kubenswrapper[4697]: E0220 16:32:35.876233 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.876910 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 16:32:35 crc kubenswrapper[4697]: E0220 16:32:35.877060 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.877163 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 16:32:35 crc kubenswrapper[4697]: E0220 16:32:35.877260 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.916284 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.916355 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.916379 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.916413 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:35 crc kubenswrapper[4697]: I0220 16:32:35.916463 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:35Z","lastTransitionTime":"2026-02-20T16:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.019246 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.019296 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.019314 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.019341 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.019361 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:36Z","lastTransitionTime":"2026-02-20T16:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.122383 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.122495 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.122510 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.122530 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.122545 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:36Z","lastTransitionTime":"2026-02-20T16:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.225219 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.225287 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.225308 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.225342 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.225363 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:36Z","lastTransitionTime":"2026-02-20T16:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.328042 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.328092 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.328104 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.328130 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.328143 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:36Z","lastTransitionTime":"2026-02-20T16:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.431666 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.431734 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.431750 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.431774 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.431789 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:36Z","lastTransitionTime":"2026-02-20T16:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.534151 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.534197 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.534210 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.534228 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.534241 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:36Z","lastTransitionTime":"2026-02-20T16:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.637275 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.637323 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.637336 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.637357 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.637369 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:36Z","lastTransitionTime":"2026-02-20T16:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.740065 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.740110 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.740122 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.740144 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.740156 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:36Z","lastTransitionTime":"2026-02-20T16:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.843963 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.844033 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.844051 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.844083 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.844103 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:36Z","lastTransitionTime":"2026-02-20T16:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.859377 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 22:23:02.72307727 +0000 UTC Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.876232 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:36 crc kubenswrapper[4697]: E0220 16:32:36.876386 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.947443 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.947489 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.947503 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.947524 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:36 crc kubenswrapper[4697]: I0220 16:32:36.947537 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:36Z","lastTransitionTime":"2026-02-20T16:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.050475 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.050517 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.050526 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.050544 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.050555 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:37Z","lastTransitionTime":"2026-02-20T16:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.153239 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.153290 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.153299 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.153317 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.153327 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:37Z","lastTransitionTime":"2026-02-20T16:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.256417 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.256486 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.256499 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.256518 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.256528 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:37Z","lastTransitionTime":"2026-02-20T16:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.359193 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.359285 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.359307 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.359344 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.359365 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:37Z","lastTransitionTime":"2026-02-20T16:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.461944 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.461987 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.461999 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.462018 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.462028 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:37Z","lastTransitionTime":"2026-02-20T16:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.564262 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.564302 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.564314 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.564330 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.564339 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:37Z","lastTransitionTime":"2026-02-20T16:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.666896 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.666940 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.666950 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.666966 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.666978 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:37Z","lastTransitionTime":"2026-02-20T16:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.768905 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.768943 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.768952 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.768967 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.768976 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:37Z","lastTransitionTime":"2026-02-20T16:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.859881 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 14:28:46.6777306 +0000 UTC Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.872207 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.872252 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.872260 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.872278 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.872289 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:37Z","lastTransitionTime":"2026-02-20T16:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.876642 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:37 crc kubenswrapper[4697]: E0220 16:32:37.876748 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.876799 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.876960 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:37 crc kubenswrapper[4697]: E0220 16:32:37.877298 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:37 crc kubenswrapper[4697]: E0220 16:32:37.877405 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.974893 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.974938 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.974951 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.974969 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:37 crc kubenswrapper[4697]: I0220 16:32:37.974979 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:37Z","lastTransitionTime":"2026-02-20T16:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.077913 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.077947 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.077954 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.077970 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.077978 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:38Z","lastTransitionTime":"2026-02-20T16:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.181731 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.181821 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.181846 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.181881 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.181905 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:38Z","lastTransitionTime":"2026-02-20T16:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.285245 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.285294 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.285304 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.285326 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.285338 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:38Z","lastTransitionTime":"2026-02-20T16:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.387802 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.387863 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.387880 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.387902 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.387921 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:38Z","lastTransitionTime":"2026-02-20T16:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.491573 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.491637 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.491650 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.491679 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.491692 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:38Z","lastTransitionTime":"2026-02-20T16:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.593321 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.593369 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.593383 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.593403 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.593419 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:38Z","lastTransitionTime":"2026-02-20T16:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.696488 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.696546 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.696559 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.696583 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.696597 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:38Z","lastTransitionTime":"2026-02-20T16:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.799649 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.799699 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.799709 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.799733 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.799746 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:38Z","lastTransitionTime":"2026-02-20T16:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.860100 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 05:02:24.968570034 +0000 UTC Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.876778 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:38 crc kubenswrapper[4697]: E0220 16:32:38.877035 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.902500 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.902558 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.902577 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.902603 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:38 crc kubenswrapper[4697]: I0220 16:32:38.902625 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:38Z","lastTransitionTime":"2026-02-20T16:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.005590 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.005652 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.005674 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.005703 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.005725 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.109046 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.109176 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.109196 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.109231 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.109252 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.212192 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.212658 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.212824 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.212987 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.213126 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.316260 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.316326 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.316344 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.316372 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.316391 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.419380 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.419465 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.419485 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.419513 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.419532 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.523101 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.523165 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.523185 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.523229 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.523252 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.587428 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.587549 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.587570 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.587604 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.587625 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: E0220 16:32:39.602875 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:39Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.607650 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.607752 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.607772 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.607834 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.607856 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: E0220 16:32:39.623809 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:39Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.628676 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.628736 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.628761 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.628792 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.628812 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: E0220 16:32:39.646788 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:39Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.651082 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.651194 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.651214 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.651283 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.651308 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: E0220 16:32:39.666992 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:39Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.676690 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.676766 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.676784 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.676809 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.676831 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: E0220 16:32:39.694944 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:39Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:39 crc kubenswrapper[4697]: E0220 16:32:39.695318 4697 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.697447 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.697515 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.697529 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.697553 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.697571 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.799905 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.799948 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.799959 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.799978 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.799991 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.860288 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 00:54:13.967207497 +0000 UTC Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.876743 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.876795 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:39 crc kubenswrapper[4697]: E0220 16:32:39.876910 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.877049 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:39 crc kubenswrapper[4697]: E0220 16:32:39.877735 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.878624 4697 scope.go:117] "RemoveContainer" containerID="fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a" Feb 20 16:32:39 crc kubenswrapper[4697]: E0220 16:32:39.878628 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:39 crc kubenswrapper[4697]: E0220 16:32:39.878935 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.903114 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.903145 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.903157 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.903178 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:39 crc kubenswrapper[4697]: I0220 16:32:39.903190 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:39Z","lastTransitionTime":"2026-02-20T16:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.005896 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.005946 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.005966 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.005992 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.006010 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:40Z","lastTransitionTime":"2026-02-20T16:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.108663 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.108723 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.108743 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.108767 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.108788 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:40Z","lastTransitionTime":"2026-02-20T16:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.212047 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.212116 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.212129 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.212153 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.212165 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:40Z","lastTransitionTime":"2026-02-20T16:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.315045 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.315115 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.315126 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.315151 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.315163 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:40Z","lastTransitionTime":"2026-02-20T16:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.418902 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.418990 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.419006 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.419056 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.419073 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:40Z","lastTransitionTime":"2026-02-20T16:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.521485 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.521544 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.521556 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.521579 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.521593 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:40Z","lastTransitionTime":"2026-02-20T16:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.623682 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.623746 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.623766 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.623794 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.623812 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:40Z","lastTransitionTime":"2026-02-20T16:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.726608 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.726687 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.726701 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.726719 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.726735 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:40Z","lastTransitionTime":"2026-02-20T16:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.829292 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.829383 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.829408 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.829481 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.829506 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:40Z","lastTransitionTime":"2026-02-20T16:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.860412 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 04:01:16.299162041 +0000 UTC Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.877760 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:40 crc kubenswrapper[4697]: E0220 16:32:40.877903 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.932559 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.932603 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.932614 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.932632 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:40 crc kubenswrapper[4697]: I0220 16:32:40.932643 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:40Z","lastTransitionTime":"2026-02-20T16:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.035119 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.035181 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.035206 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.035238 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.035258 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:41Z","lastTransitionTime":"2026-02-20T16:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.137489 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.137515 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.137526 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.137540 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.137553 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:41Z","lastTransitionTime":"2026-02-20T16:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.239656 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.239719 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.239730 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.239755 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.239772 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:41Z","lastTransitionTime":"2026-02-20T16:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.341874 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.341913 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.341924 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.341947 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.341958 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:41Z","lastTransitionTime":"2026-02-20T16:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.444463 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.444530 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.444544 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.444572 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.444589 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:41Z","lastTransitionTime":"2026-02-20T16:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.546850 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.546895 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.546906 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.546923 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.546938 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:41Z","lastTransitionTime":"2026-02-20T16:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.649305 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.649356 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.649367 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.649385 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.649402 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:41Z","lastTransitionTime":"2026-02-20T16:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.751968 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.752005 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.752014 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.752027 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.752035 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:41Z","lastTransitionTime":"2026-02-20T16:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.855207 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.855276 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.855286 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.855302 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.855317 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:41Z","lastTransitionTime":"2026-02-20T16:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.861388 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 16:04:46.307244801 +0000 UTC Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.876941 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:41 crc kubenswrapper[4697]: E0220 16:32:41.877101 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.877567 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.877693 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:41 crc kubenswrapper[4697]: E0220 16:32:41.877902 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:41 crc kubenswrapper[4697]: E0220 16:32:41.878907 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.958901 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.958959 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.958973 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.958997 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:41 crc kubenswrapper[4697]: I0220 16:32:41.959012 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:41Z","lastTransitionTime":"2026-02-20T16:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.063655 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.063763 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.063782 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.063816 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.063842 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:42Z","lastTransitionTime":"2026-02-20T16:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.166850 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.166923 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.166935 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.166954 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.166984 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:42Z","lastTransitionTime":"2026-02-20T16:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.269531 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.269564 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.269573 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.269591 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.269601 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:42Z","lastTransitionTime":"2026-02-20T16:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.333357 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:42 crc kubenswrapper[4697]: E0220 16:32:42.333678 4697 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:42 crc kubenswrapper[4697]: E0220 16:32:42.333784 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs podName:0aff33f1-a871-41df-a6f1-fd7146e23a9c nodeName:}" failed. No retries permitted until 2026-02-20 16:33:14.333753766 +0000 UTC m=+102.113799204 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs") pod "network-metrics-daemon-nskrw" (UID: "0aff33f1-a871-41df-a6f1-fd7146e23a9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.372099 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.372168 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.372188 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.372218 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.372235 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:42Z","lastTransitionTime":"2026-02-20T16:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.475485 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.475557 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.475576 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.475608 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.475627 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:42Z","lastTransitionTime":"2026-02-20T16:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.578816 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.578887 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.578905 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.578937 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.578956 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:42Z","lastTransitionTime":"2026-02-20T16:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.682084 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.682185 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.682213 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.682249 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.682275 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:42Z","lastTransitionTime":"2026-02-20T16:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.785117 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.785164 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.785176 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.785196 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.785208 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:42Z","lastTransitionTime":"2026-02-20T16:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.861546 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 06:20:30.509368504 +0000 UTC Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.876475 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:42 crc kubenswrapper[4697]: E0220 16:32:42.876725 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.887788 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.887843 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.887864 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.887891 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.887913 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:42Z","lastTransitionTime":"2026-02-20T16:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.899910 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:42Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.917289 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:42Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.948252 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6ac
ad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:42Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.969650 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:42Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.992853 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:42Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.995923 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.996107 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.996538 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:42 crc 
kubenswrapper[4697]: I0220 16:32:42.997078 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:42 crc kubenswrapper[4697]: I0220 16:32:42.997550 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:42Z","lastTransitionTime":"2026-02-20T16:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.007666 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc 
kubenswrapper[4697]: I0220 16:32:43.022766 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.036479 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.050796 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.072994 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.087684 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.100270 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.100316 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.100336 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.100364 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.100385 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:43Z","lastTransitionTime":"2026-02-20T16:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.102577 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.116782 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.147733 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:24Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0220 16:32:24.946608 6352 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 16:32:24.946783 6352 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.946805 6352 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 
16:32:24.946783 6352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:24.947100 6352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0220 16:32:24.947247 6352 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:24.947308 6352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0220 16:32:24.947384 6352 factory.go:656] Stopping watch factory\\\\nI0220 16:32:24.947457 6352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:24.947465 6352 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947551 6352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:24.947039 6352 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947560 6352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0220 16:32:24.947591 6352 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.161483 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa
50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.175394 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d76d5561-c9fc-4377-b764-5a6856eada68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b231279d61bb858c2d662fd41b388d450db0ed9f92b55f968a334a2ce2b50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1adcc43d1a375bc06218f7d3c94c564132a5f3dd5cde0c7ee1f86883b8100552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b78299504f4a4555d5fa6bd331589ed16effa4428034951de3d8f83ce652780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.191515 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.203405 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.203515 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.203543 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 
16:32:43.203578 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.203608 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:43Z","lastTransitionTime":"2026-02-20T16:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.215508 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:43Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.306614 4697 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.306701 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.306723 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.306754 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.306775 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:43Z","lastTransitionTime":"2026-02-20T16:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.410334 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.410409 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.410422 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.410469 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.410484 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:43Z","lastTransitionTime":"2026-02-20T16:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.513168 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.513214 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.513225 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.513244 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.513260 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:43Z","lastTransitionTime":"2026-02-20T16:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.616703 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.616768 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.616789 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.616819 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.616838 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:43Z","lastTransitionTime":"2026-02-20T16:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.719357 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.719422 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.719453 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.719478 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.719521 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:43Z","lastTransitionTime":"2026-02-20T16:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.823182 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.823225 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.823235 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.823252 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.823263 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:43Z","lastTransitionTime":"2026-02-20T16:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.862585 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 14:55:10.288509514 +0000 UTC Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.877237 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.877298 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.877352 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:43 crc kubenswrapper[4697]: E0220 16:32:43.877624 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:43 crc kubenswrapper[4697]: E0220 16:32:43.877724 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:43 crc kubenswrapper[4697]: E0220 16:32:43.877865 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.925686 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.926030 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.926130 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.926238 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:43 crc kubenswrapper[4697]: I0220 16:32:43.926321 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:43Z","lastTransitionTime":"2026-02-20T16:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.029656 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.029693 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.029704 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.029718 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.029727 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:44Z","lastTransitionTime":"2026-02-20T16:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.132593 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.132659 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.132674 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.132692 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.132704 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:44Z","lastTransitionTime":"2026-02-20T16:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.235271 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.235544 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.235619 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.235704 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.235985 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:44Z","lastTransitionTime":"2026-02-20T16:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.338552 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.338832 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.338931 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.339025 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.339095 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:44Z","lastTransitionTime":"2026-02-20T16:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.356608 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lrpxf_1de5dc4e-ef42-48fc-be23-eaec2039c031/kube-multus/0.log" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.356994 4697 generic.go:334] "Generic (PLEG): container finished" podID="1de5dc4e-ef42-48fc-be23-eaec2039c031" containerID="d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280" exitCode=1 Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.357095 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lrpxf" event={"ID":"1de5dc4e-ef42-48fc-be23-eaec2039c031","Type":"ContainerDied","Data":"d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280"} Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.357469 4697 scope.go:117] "RemoveContainer" containerID="d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.370894 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d76d5561-c9fc-4377-b764-5a6856eada68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b231279d61bb858c2d662fd41b388d450db0ed9f92b55f968a334a2ce2b50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1adcc43d1a375bc06218f7d3c94c564132a5f3dd5cde0c7ee1f86883b8100552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b78299504f4a4555d5fa6bd331589ed16effa4428034951de3d8f83ce652780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.386534 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.400002 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.410985 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.427976 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:43Z\\\",\\\"message\\\":\\\"2026-02-20T16:31:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b\\\\n2026-02-20T16:31:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b to /host/opt/cni/bin/\\\\n2026-02-20T16:31:58Z [verbose] multus-daemon started\\\\n2026-02-20T16:31:58Z [verbose] Readiness Indicator file check\\\\n2026-02-20T16:32:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.441663 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.441711 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.441721 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.441745 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.441758 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:44Z","lastTransitionTime":"2026-02-20T16:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.456049 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:24Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0220 16:32:24.946608 6352 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 16:32:24.946783 6352 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.946805 6352 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 
16:32:24.946783 6352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:24.947100 6352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0220 16:32:24.947247 6352 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:24.947308 6352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0220 16:32:24.947384 6352 factory.go:656] Stopping watch factory\\\\nI0220 16:32:24.947457 6352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:24.947465 6352 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947551 6352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:24.947039 6352 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947560 6352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0220 16:32:24.947591 6352 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.472037 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa
50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.495855 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.511138 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.529286 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.544832 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.546269 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.546312 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.546327 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.546352 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.546421 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:44Z","lastTransitionTime":"2026-02-20T16:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.556821 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.568950 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.581130 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.593275 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.603733 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc 
kubenswrapper[4697]: I0220 16:32:44.616735 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.633057 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:44Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.649422 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.649627 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.649648 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.649682 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.649704 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:44Z","lastTransitionTime":"2026-02-20T16:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.752925 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.752968 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.752980 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.753003 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.753016 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:44Z","lastTransitionTime":"2026-02-20T16:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.857111 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.857181 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.857206 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.857244 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.857268 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:44Z","lastTransitionTime":"2026-02-20T16:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.863895 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 04:01:48.816312632 +0000 UTC Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.876799 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:44 crc kubenswrapper[4697]: E0220 16:32:44.876982 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.969489 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.969572 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.969597 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.969632 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:44 crc kubenswrapper[4697]: I0220 16:32:44.969655 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:44Z","lastTransitionTime":"2026-02-20T16:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.072700 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.072743 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.072754 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.072775 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.072787 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:45Z","lastTransitionTime":"2026-02-20T16:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.175426 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.175534 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.175558 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.175586 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.175605 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:45Z","lastTransitionTime":"2026-02-20T16:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.278525 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.278565 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.278579 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.278598 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.278611 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:45Z","lastTransitionTime":"2026-02-20T16:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.362939 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lrpxf_1de5dc4e-ef42-48fc-be23-eaec2039c031/kube-multus/0.log" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.363005 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lrpxf" event={"ID":"1de5dc4e-ef42-48fc-be23-eaec2039c031","Type":"ContainerStarted","Data":"a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175"} Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.380007 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.380614 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.380655 4697 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.380668 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.380696 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.380711 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:45Z","lastTransitionTime":"2026-02-20T16:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.393414 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.403307 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.415029 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc 
kubenswrapper[4697]: I0220 16:32:45.433554 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.450210 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.468135 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.483987 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.484036 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.484051 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.484074 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.484090 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:45Z","lastTransitionTime":"2026-02-20T16:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.486247 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d76d5561-c9fc-4377-b764-5a6856eada68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b231279d61bb858c2d662fd41b388d450db0ed9f92b55f968a334a2ce2b50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1adcc43d1a375bc06218f7d3c94c56
4132a5f3dd5cde0c7ee1f86883b8100552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b78299504f4a4555d5fa6bd331589ed16effa4428034951de3d8f83ce652780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.504722 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.524137 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.535158 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.552162 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:43Z\\\",\\\"message\\\":\\\"2026-02-20T16:31:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b\\\\n2026-02-20T16:31:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b to /host/opt/cni/bin/\\\\n2026-02-20T16:31:58Z [verbose] multus-daemon started\\\\n2026-02-20T16:31:58Z [verbose] Readiness Indicator file check\\\\n2026-02-20T16:32:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.571216 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:24Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0220 16:32:24.946608 6352 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 16:32:24.946783 6352 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.946805 6352 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 
16:32:24.946783 6352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:24.947100 6352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0220 16:32:24.947247 6352 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:24.947308 6352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0220 16:32:24.947384 6352 factory.go:656] Stopping watch factory\\\\nI0220 16:32:24.947457 6352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:24.947465 6352 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947551 6352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:24.947039 6352 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947560 6352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0220 16:32:24.947591 6352 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.587119 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.587187 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.587200 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.587224 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.587240 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:45Z","lastTransitionTime":"2026-02-20T16:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.597267 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.615082 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.637117 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.659366 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.675068 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:45Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:45 crc kubenswrapper[4697]: 
I0220 16:32:45.690210 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.690248 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.690259 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.690280 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.690292 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:45Z","lastTransitionTime":"2026-02-20T16:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.794172 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.794224 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.794235 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.794253 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.794265 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:45Z","lastTransitionTime":"2026-02-20T16:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.864493 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 11:36:36.679880523 +0000 UTC Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.876957 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:45 crc kubenswrapper[4697]: E0220 16:32:45.877133 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.877401 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.877513 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:45 crc kubenswrapper[4697]: E0220 16:32:45.877542 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:45 crc kubenswrapper[4697]: E0220 16:32:45.877700 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.896912 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.896943 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.896953 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.896972 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:45 crc kubenswrapper[4697]: I0220 16:32:45.896985 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:45Z","lastTransitionTime":"2026-02-20T16:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.000234 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.000286 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.000300 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.000321 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.000334 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:46Z","lastTransitionTime":"2026-02-20T16:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.102887 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.102930 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.102943 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.102960 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.102973 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:46Z","lastTransitionTime":"2026-02-20T16:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.206830 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.206897 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.206918 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.206950 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.206971 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:46Z","lastTransitionTime":"2026-02-20T16:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.310067 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.310143 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.310166 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.310196 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.310220 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:46Z","lastTransitionTime":"2026-02-20T16:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.413212 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.413276 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.413295 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.413321 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.413343 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:46Z","lastTransitionTime":"2026-02-20T16:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.517228 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.517293 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.517309 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.517335 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.517356 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:46Z","lastTransitionTime":"2026-02-20T16:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.620756 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.620801 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.620810 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.620827 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.620840 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:46Z","lastTransitionTime":"2026-02-20T16:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.724880 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.724914 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.724923 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.724939 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.724953 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:46Z","lastTransitionTime":"2026-02-20T16:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.828230 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.828303 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.828326 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.828364 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.828387 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:46Z","lastTransitionTime":"2026-02-20T16:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.865686 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 11:37:03.895401403 +0000 UTC Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.877874 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:46 crc kubenswrapper[4697]: E0220 16:32:46.878174 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.932005 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.932072 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.932092 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.932129 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:46 crc kubenswrapper[4697]: I0220 16:32:46.932155 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:46Z","lastTransitionTime":"2026-02-20T16:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.035754 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.035818 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.035831 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.035864 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.035879 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:47Z","lastTransitionTime":"2026-02-20T16:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.139914 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.139983 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.140002 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.140030 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.140044 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:47Z","lastTransitionTime":"2026-02-20T16:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.244365 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.244401 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.244413 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.244428 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.244456 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:47Z","lastTransitionTime":"2026-02-20T16:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.348496 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.348601 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.348647 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.348693 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.348719 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:47Z","lastTransitionTime":"2026-02-20T16:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.452599 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.452664 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.452685 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.452713 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.452733 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:47Z","lastTransitionTime":"2026-02-20T16:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.556910 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.556972 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.556989 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.557018 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.557034 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:47Z","lastTransitionTime":"2026-02-20T16:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.660508 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.660571 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.660584 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.660608 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.660623 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:47Z","lastTransitionTime":"2026-02-20T16:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.776803 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.776878 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.776893 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.776918 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.776934 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:47Z","lastTransitionTime":"2026-02-20T16:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.866738 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 00:57:57.799825436 +0000 UTC Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.876278 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.876380 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.876478 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:47 crc kubenswrapper[4697]: E0220 16:32:47.876524 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:47 crc kubenswrapper[4697]: E0220 16:32:47.876618 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:47 crc kubenswrapper[4697]: E0220 16:32:47.876770 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.879905 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.879939 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.879955 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.879976 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.879989 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:47Z","lastTransitionTime":"2026-02-20T16:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.983731 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.983790 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.983810 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.983840 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:47 crc kubenswrapper[4697]: I0220 16:32:47.983860 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:47Z","lastTransitionTime":"2026-02-20T16:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.086764 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.086830 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.086850 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.086873 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.086886 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:48Z","lastTransitionTime":"2026-02-20T16:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.188826 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.188869 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.188881 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.188899 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.189232 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:48Z","lastTransitionTime":"2026-02-20T16:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.293516 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.293560 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.293574 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.293591 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.293603 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:48Z","lastTransitionTime":"2026-02-20T16:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.396392 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.396471 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.396484 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.396501 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.396510 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:48Z","lastTransitionTime":"2026-02-20T16:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.499637 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.499728 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.499760 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.499798 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.499826 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:48Z","lastTransitionTime":"2026-02-20T16:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.602533 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.602608 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.602633 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.602669 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.602693 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:48Z","lastTransitionTime":"2026-02-20T16:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.706290 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.706348 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.706369 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.706391 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.706407 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:48Z","lastTransitionTime":"2026-02-20T16:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.810169 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.810247 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.810277 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.810318 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.810345 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:48Z","lastTransitionTime":"2026-02-20T16:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.867289 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 20:33:12.217968024 +0000 UTC Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.876779 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:48 crc kubenswrapper[4697]: E0220 16:32:48.876996 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.914093 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.914153 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.914177 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.914206 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:48 crc kubenswrapper[4697]: I0220 16:32:48.914228 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:48Z","lastTransitionTime":"2026-02-20T16:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.017297 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.017358 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.017381 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.017415 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.017476 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.121007 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.121073 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.121099 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.121136 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.121162 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.224331 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.224390 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.224412 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.224479 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.224511 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.328211 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.328277 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.328296 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.328326 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.328346 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.431426 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.431593 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.431623 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.431654 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.431676 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.535117 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.535177 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.535196 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.535223 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.535246 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.638806 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.638872 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.638903 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.638936 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.638959 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.701384 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.701451 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.701464 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.701490 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.701504 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: E0220 16:32:49.720551 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:49Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.725861 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.725889 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.725897 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.725915 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.725933 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: E0220 16:32:49.742643 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:49Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.746969 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.747017 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.747028 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.747046 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.747063 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: E0220 16:32:49.759782 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:49Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.764245 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.764344 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.764365 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.764423 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.764497 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: E0220 16:32:49.785141 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:49Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.790158 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.790201 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.790235 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.790273 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.790285 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: E0220 16:32:49.805730 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:49Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:49 crc kubenswrapper[4697]: E0220 16:32:49.805959 4697 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.808827 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.808879 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.808898 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.808926 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.808946 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.867931 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 22:03:20.440409071 +0000 UTC Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.876401 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.876573 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:49 crc kubenswrapper[4697]: E0220 16:32:49.876673 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:49 crc kubenswrapper[4697]: E0220 16:32:49.876799 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.877194 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:49 crc kubenswrapper[4697]: E0220 16:32:49.877315 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.912286 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.912357 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.912378 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.912407 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:49 crc kubenswrapper[4697]: I0220 16:32:49.912426 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:49Z","lastTransitionTime":"2026-02-20T16:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.016113 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.016178 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.016196 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.016228 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.016251 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:50Z","lastTransitionTime":"2026-02-20T16:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.119708 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.119774 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.119798 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.119829 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.119849 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:50Z","lastTransitionTime":"2026-02-20T16:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.223596 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.223681 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.223709 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.223747 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.223776 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:50Z","lastTransitionTime":"2026-02-20T16:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.325827 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.325859 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.325868 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.325883 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.325894 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:50Z","lastTransitionTime":"2026-02-20T16:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.428684 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.428752 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.428767 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.428794 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.428808 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:50Z","lastTransitionTime":"2026-02-20T16:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.532571 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.532638 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.532658 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.532691 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.532713 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:50Z","lastTransitionTime":"2026-02-20T16:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.635780 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.635893 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.635924 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.635969 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.636000 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:50Z","lastTransitionTime":"2026-02-20T16:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.740155 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.740229 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.740250 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.740282 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.740304 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:50Z","lastTransitionTime":"2026-02-20T16:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.843511 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.843598 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.843617 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.843649 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.843668 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:50Z","lastTransitionTime":"2026-02-20T16:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.868960 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 07:10:32.46197951 +0000 UTC
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.876529 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 16:32:50 crc kubenswrapper[4697]: E0220 16:32:50.876771 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.946921 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.946983 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.947002 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.947038 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:50 crc kubenswrapper[4697]: I0220 16:32:50.947060 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:50Z","lastTransitionTime":"2026-02-20T16:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.051051 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.051114 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.051135 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.051165 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.051187 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:51Z","lastTransitionTime":"2026-02-20T16:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.154845 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.154917 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.154941 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.154976 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.154999 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:51Z","lastTransitionTime":"2026-02-20T16:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.258323 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.258383 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.258403 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.258465 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.258487 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:51Z","lastTransitionTime":"2026-02-20T16:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.361954 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.362025 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.362051 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.362087 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.362111 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:51Z","lastTransitionTime":"2026-02-20T16:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.465353 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.465472 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.465498 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.465533 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.465558 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:51Z","lastTransitionTime":"2026-02-20T16:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.569317 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.569399 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.569419 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.569511 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.569540 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:51Z","lastTransitionTime":"2026-02-20T16:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.673340 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.673421 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.673472 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.673508 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.673530 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:51Z","lastTransitionTime":"2026-02-20T16:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.777463 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.777517 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.777529 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.777550 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.777562 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:51Z","lastTransitionTime":"2026-02-20T16:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.870139 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 17:26:36.379499362 +0000 UTC
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.876597 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.876651 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw"
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.876608 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 16:32:51 crc kubenswrapper[4697]: E0220 16:32:51.876824 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 16:32:51 crc kubenswrapper[4697]: E0220 16:32:51.876955 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 16:32:51 crc kubenswrapper[4697]: E0220 16:32:51.877072 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c"
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.881623 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.881699 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.881767 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.881800 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.881823 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:51Z","lastTransitionTime":"2026-02-20T16:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.985469 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.985561 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.985591 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.985633 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:51 crc kubenswrapper[4697]: I0220 16:32:51.985666 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:51Z","lastTransitionTime":"2026-02-20T16:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.089425 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.089549 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.089573 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.089622 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.089646 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:52Z","lastTransitionTime":"2026-02-20T16:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.194023 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.194092 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.194194 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.194246 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.194267 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:52Z","lastTransitionTime":"2026-02-20T16:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.298607 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.298684 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.298706 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.298739 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.298764 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:52Z","lastTransitionTime":"2026-02-20T16:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.401547 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.401607 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.401624 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.401652 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.401671 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:52Z","lastTransitionTime":"2026-02-20T16:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.506132 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.506194 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.506213 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.506245 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.506265 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:52Z","lastTransitionTime":"2026-02-20T16:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.611314 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.611400 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.611427 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.611507 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.611532 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:52Z","lastTransitionTime":"2026-02-20T16:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.715190 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.715257 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.715276 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.715306 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.715325 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:52Z","lastTransitionTime":"2026-02-20T16:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.822058 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.822180 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.822202 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.822235 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.822255 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:52Z","lastTransitionTime":"2026-02-20T16:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.871029 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 00:37:21.042784288 +0000 UTC Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.876525 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:52 crc kubenswrapper[4697]: E0220 16:32:52.876766 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.915546 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:52Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.925786 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.925849 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.925873 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.925909 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.925936 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:52Z","lastTransitionTime":"2026-02-20T16:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.938807 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd
90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:52Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.963679 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:52Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:52 crc kubenswrapper[4697]: I0220 16:32:52.990355 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:52Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.013592 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: 
I0220 16:32:53.028680 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.028717 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.028731 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.028756 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.028771 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:53Z","lastTransitionTime":"2026-02-20T16:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.037717 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907
ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.059475 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.076025 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.091057 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc 
kubenswrapper[4697]: I0220 16:32:53.113678 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.128548 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.132393 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.132470 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.132492 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.132523 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.132546 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:53Z","lastTransitionTime":"2026-02-20T16:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.154091 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:24Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0220 16:32:24.946608 6352 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 16:32:24.946783 6352 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.946805 6352 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 
16:32:24.946783 6352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:24.947100 6352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0220 16:32:24.947247 6352 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:24.947308 6352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0220 16:32:24.947384 6352 factory.go:656] Stopping watch factory\\\\nI0220 16:32:24.947457 6352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:24.947465 6352 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947551 6352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:24.947039 6352 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947560 6352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0220 16:32:24.947591 6352 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.167902 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa
50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.182793 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d76d5561-c9fc-4377-b764-5a6856eada68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b231279d61bb858c2d662fd41b388d450db0ed9f92b55f968a334a2ce2b50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1adcc43d1a375bc06218f7d3c94c564132a5f3dd5cde0c7ee1f86883b8100552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b78299504f4a4555d5fa6bd331589ed16effa4428034951de3d8f83ce652780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.204365 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.227634 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.235241 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.235297 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.235321 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.235359 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.235384 4697 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:53Z","lastTransitionTime":"2026-02-20T16:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.244380 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.265652 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:43Z\\\",\\\"message\\\":\\\"2026-02-20T16:31:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b\\\\n2026-02-20T16:31:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b to /host/opt/cni/bin/\\\\n2026-02-20T16:31:58Z [verbose] multus-daemon started\\\\n2026-02-20T16:31:58Z [verbose] 
Readiness Indicator file check\\\\n2026-02-20T16:32:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:53Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.339068 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.339138 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.339159 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.339192 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.339216 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:53Z","lastTransitionTime":"2026-02-20T16:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.443495 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.443532 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.443543 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.443563 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.443577 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:53Z","lastTransitionTime":"2026-02-20T16:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.545942 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.546000 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.546014 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.546035 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.546053 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:53Z","lastTransitionTime":"2026-02-20T16:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.649625 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.649713 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.649734 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.649764 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.649814 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:53Z","lastTransitionTime":"2026-02-20T16:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.754576 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.754683 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.754716 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.754756 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.754777 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:53Z","lastTransitionTime":"2026-02-20T16:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.860048 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.860120 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.860141 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.860167 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.860185 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:53Z","lastTransitionTime":"2026-02-20T16:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.871210 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 09:04:32.869335093 +0000 UTC Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.876212 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.876351 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:53 crc kubenswrapper[4697]: E0220 16:32:53.876685 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.876861 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:53 crc kubenswrapper[4697]: E0220 16:32:53.877029 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:53 crc kubenswrapper[4697]: E0220 16:32:53.877113 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.963120 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.963174 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.963193 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.963255 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:53 crc kubenswrapper[4697]: I0220 16:32:53.963275 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:53Z","lastTransitionTime":"2026-02-20T16:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.066664 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.066711 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.066723 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.066742 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.066755 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:54Z","lastTransitionTime":"2026-02-20T16:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.170198 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.170237 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.170247 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.170262 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.170274 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:54Z","lastTransitionTime":"2026-02-20T16:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.273565 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.273663 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.273687 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.273724 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.273747 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:54Z","lastTransitionTime":"2026-02-20T16:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.377782 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.377834 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.377845 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.377862 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.377872 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:54Z","lastTransitionTime":"2026-02-20T16:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.481195 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.481259 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.481274 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.481291 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.481304 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:54Z","lastTransitionTime":"2026-02-20T16:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.585172 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.585267 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.585295 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.585332 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.585356 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:54Z","lastTransitionTime":"2026-02-20T16:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.688836 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.688870 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.688880 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.688896 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.688906 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:54Z","lastTransitionTime":"2026-02-20T16:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.791079 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.791110 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.791135 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.791150 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.791159 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:54Z","lastTransitionTime":"2026-02-20T16:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.872263 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 18:26:48.947868514 +0000 UTC Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.877925 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:54 crc kubenswrapper[4697]: E0220 16:32:54.878607 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.879150 4697 scope.go:117] "RemoveContainer" containerID="fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.894811 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.894870 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.894888 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.894912 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:54 crc kubenswrapper[4697]: I0220 16:32:54.894930 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:54Z","lastTransitionTime":"2026-02-20T16:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.002494 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.003190 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.003221 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.003323 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.003414 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:55Z","lastTransitionTime":"2026-02-20T16:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.108336 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.108646 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.109018 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.109054 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.109072 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:55Z","lastTransitionTime":"2026-02-20T16:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.212680 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.212758 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.212783 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.212819 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.212842 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:55Z","lastTransitionTime":"2026-02-20T16:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.316445 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.316518 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.316531 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.316554 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.316572 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:55Z","lastTransitionTime":"2026-02-20T16:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.402459 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/2.log" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.406754 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41"} Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.407492 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.424398 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.424472 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.424486 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.424510 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.424528 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:55Z","lastTransitionTime":"2026-02-20T16:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.432047 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.457347 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.481388 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6ac
ad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.499604 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.515815 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.528290 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.528348 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.528360 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:55 crc 
kubenswrapper[4697]: I0220 16:32:55.528382 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.528399 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:55Z","lastTransitionTime":"2026-02-20T16:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.532683 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc 
kubenswrapper[4697]: I0220 16:32:55.547032 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.560741 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.574011 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.589104 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.610420 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.630470 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.631355 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.631397 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.631409 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.631437 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.631471 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:55Z","lastTransitionTime":"2026-02-20T16:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.660415 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:43Z\\\",\\\"message\\\":\\\"2026-02-20T16:31:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b\\\\n2026-02-20T16:31:58+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b to /host/opt/cni/bin/\\\\n2026-02-20T16:31:58Z [verbose] multus-daemon started\\\\n2026-02-20T16:31:58Z [verbose] Readiness Indicator file check\\\\n2026-02-20T16:32:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.678787 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:24Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0220 16:32:24.946608 6352 handler.go:190] Sending *v1.NetworkPolicy 
event handler 4 for removal\\\\nI0220 16:32:24.946783 6352 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.946805 6352 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 16:32:24.946783 6352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:24.947100 6352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0220 16:32:24.947247 6352 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:24.947308 6352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0220 16:32:24.947384 6352 factory.go:656] Stopping watch factory\\\\nI0220 16:32:24.947457 6352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:24.947465 6352 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947551 6352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:24.947039 6352 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947560 6352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0220 16:32:24.947591 6352 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.691342 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa
50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.699507 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 
16:32:55.699645 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.699758 4697 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.699784 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:33:59.699740458 +0000 UTC m=+147.479785876 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.699840 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:33:59.69982071 +0000 UTC m=+147.479866138 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.699889 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.700001 4697 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.700048 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:33:59.700038586 +0000 UTC m=+147.480084004 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.705219 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d76d5561-c9fc-4377-b764-5a6856eada68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b231279d61bb858c2d662fd41b388d450db0ed9f92b55f968a334a2ce2b50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1adcc43d1a375bc06218f7d3c94c564132a5f3dd5cde0c7ee1f86883b8100552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b78299504f4a4555d5fa6bd331589ed16effa4428034951de3d8f83ce652780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initCont
ainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.726744 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.734741 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.734805 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.734820 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.734846 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.734862 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:55Z","lastTransitionTime":"2026-02-20T16:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.740949 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:55Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.801040 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.801107 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.801275 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.801301 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.801311 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.801322 4697 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.801330 4697 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.801338 4697 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.801398 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 16:33:59.801378473 +0000 UTC m=+147.581423881 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.801419 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 16:33:59.801411094 +0000 UTC m=+147.581456492 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.837884 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.837917 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.837928 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.837944 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.837953 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:55Z","lastTransitionTime":"2026-02-20T16:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.872776 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 14:59:39.705623757 +0000 UTC Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.876140 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.876192 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.876158 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.876327 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.876416 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:55 crc kubenswrapper[4697]: E0220 16:32:55.876664 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.940321 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.940373 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.940391 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.940417 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:55 crc kubenswrapper[4697]: I0220 16:32:55.940439 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:55Z","lastTransitionTime":"2026-02-20T16:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.044253 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.044332 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.044352 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.044384 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.044407 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:56Z","lastTransitionTime":"2026-02-20T16:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.147372 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.147424 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.147457 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.147480 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.147496 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:56Z","lastTransitionTime":"2026-02-20T16:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.250788 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.250843 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.250861 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.250882 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.250899 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:56Z","lastTransitionTime":"2026-02-20T16:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.354393 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.354502 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.354524 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.354556 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.354577 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:56Z","lastTransitionTime":"2026-02-20T16:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.414387 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/3.log" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.415527 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/2.log" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.419528 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" exitCode=1 Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.419583 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41"} Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.419631 4697 scope.go:117] "RemoveContainer" containerID="fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.421886 4697 scope.go:117] "RemoveContainer" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:32:56 crc kubenswrapper[4697]: E0220 16:32:56.422193 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.452007 4697 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.457419 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.457523 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.457549 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.457577 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.457597 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:56Z","lastTransitionTime":"2026-02-20T16:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.471880 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.491851 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d76d5561-c9fc-4377-b764-5a6856eada68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b231279d61bb858c2d662fd41b388d450db0ed9f92b55f968a334a2ce2b50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1adcc43d1a375bc06218f7d3c94c564132a5f3dd5cde0c7ee1f86883b8100552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b78299504f4a4555d5fa6bd331589ed16effa4428034951de3d8f83ce652780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.513630 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.538069 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.557596 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.560076 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.560277 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.560416 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.560626 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.560797 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:56Z","lastTransitionTime":"2026-02-20T16:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.582070 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:43Z\\\",\\\"message\\\":\\\"2026-02-20T16:31:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b\\\\n2026-02-20T16:31:58+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b to /host/opt/cni/bin/\\\\n2026-02-20T16:31:58Z [verbose] multus-daemon started\\\\n2026-02-20T16:31:58Z [verbose] Readiness Indicator file check\\\\n2026-02-20T16:32:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.616192 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac0f0cd202d6e803c6e80fd9a92266844828d7a85dbaf0304dba70f565d153a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:24Z\\\",\\\"message\\\":\\\"tory.go:160\\\\nI0220 16:32:24.946608 6352 handler.go:190] Sending *v1.NetworkPolicy 
event handler 4 for removal\\\\nI0220 16:32:24.946783 6352 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.946805 6352 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 16:32:24.946783 6352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 16:32:24.947100 6352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0220 16:32:24.947247 6352 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 16:32:24.947308 6352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0220 16:32:24.947384 6352 factory.go:656] Stopping watch factory\\\\nI0220 16:32:24.947457 6352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0220 16:32:24.947465 6352 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947551 6352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0220 16:32:24.947039 6352 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 16:32:24.947560 6352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0220 16:32:24.947591 6352 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:55Z\\\",\\\"message\\\":\\\"ift-etcd/etcd-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-vtsdj openshift-network-diagnostics/network-check-source-55646444c4-trplf]\\\\nI0220 16:32:55.891347 6773 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to 
complete in iterateRetryResources\\\\nI0220 16:32:55.891378 6773 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jngbq\\\\nI0220 16:32:55.891387 6773 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0220 16:32:55.891391 6773 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.633098 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa
50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.659753 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.663633 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.663675 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.663695 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.663722 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.663742 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:56Z","lastTransitionTime":"2026-02-20T16:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.678617 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.699173 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.728804 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89
aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.746154 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: 
I0220 16:32:56.766773 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.766850 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.766875 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.766910 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.766936 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:56Z","lastTransitionTime":"2026-02-20T16:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.768578 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907
ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.787440 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.801086 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.818788 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:56Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:56 crc 
kubenswrapper[4697]: I0220 16:32:56.868894 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.868952 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.868970 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.868997 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.869015 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:56Z","lastTransitionTime":"2026-02-20T16:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.873166 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 19:10:35.697822517 +0000 UTC Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.876765 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:56 crc kubenswrapper[4697]: E0220 16:32:56.877002 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.971450 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.971495 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.971506 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.971527 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:56 crc kubenswrapper[4697]: I0220 16:32:56.971540 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:56Z","lastTransitionTime":"2026-02-20T16:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.075167 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.075237 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.075255 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.075282 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.075300 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:57Z","lastTransitionTime":"2026-02-20T16:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.178835 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.178907 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.178926 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.178954 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.178978 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:57Z","lastTransitionTime":"2026-02-20T16:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.282570 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.282626 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.282645 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.282670 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.282689 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:57Z","lastTransitionTime":"2026-02-20T16:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.385527 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.385944 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.386160 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.386352 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.386546 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:57Z","lastTransitionTime":"2026-02-20T16:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.425497 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/3.log" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.430765 4697 scope.go:117] "RemoveContainer" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:32:57 crc kubenswrapper[4697]: E0220 16:32:57.431045 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.455912 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.475265 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.489757 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.489850 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.489875 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.489909 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.489937 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:57Z","lastTransitionTime":"2026-02-20T16:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.495193 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.514329 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d76d5561-c9fc-4377-b764-5a6856eada68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b231279d61bb858c2d662fd41b388d450db0ed9f92b55f968a334a2ce2b50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1adcc43d1a375bc06218f7d3c94c564132a5f3dd5cde0c7ee1f86883b8100552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b78299504f4a4555d5fa6bd331589ed16effa4428034951de3d8f83ce652780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.533018 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.550341 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.567806 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.586715 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:43Z\\\",\\\"message\\\":\\\"2026-02-20T16:31:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b\\\\n2026-02-20T16:31:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b to /host/opt/cni/bin/\\\\n2026-02-20T16:31:58Z [verbose] multus-daemon started\\\\n2026-02-20T16:31:58Z [verbose] Readiness Indicator file check\\\\n2026-02-20T16:32:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.592929 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.594706 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.594792 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 
16:32:57.594847 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.594903 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:57Z","lastTransitionTime":"2026-02-20T16:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.615022 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:55Z\\\",\\\"message\\\":\\\"ift-etcd/etcd-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-vtsdj openshift-network-diagnostics/network-check-source-55646444c4-trplf]\\\\nI0220 16:32:55.891347 6773 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0220 
16:32:55.891378 6773 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jngbq\\\\nI0220 16:32:55.891387 6773 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0220 16:32:55.891391 6773 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.642016 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.665002 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.684057 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.697764 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.697811 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.697824 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.697850 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.697875 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:57Z","lastTransitionTime":"2026-02-20T16:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.704716 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.718276 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.735020 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.749685 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.762098 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.774274 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:57Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:57 crc 
kubenswrapper[4697]: I0220 16:32:57.800691 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.800728 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.800738 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.800751 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.800761 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:57Z","lastTransitionTime":"2026-02-20T16:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.873558 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 02:20:06.612804622 +0000 UTC Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.876997 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.876998 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.877002 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:57 crc kubenswrapper[4697]: E0220 16:32:57.877411 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:57 crc kubenswrapper[4697]: E0220 16:32:57.877462 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:57 crc kubenswrapper[4697]: E0220 16:32:57.877946 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.903228 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.903312 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.903331 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.903356 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:57 crc kubenswrapper[4697]: I0220 16:32:57.903375 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:57Z","lastTransitionTime":"2026-02-20T16:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.005600 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.005661 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.005678 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.005706 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.005730 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:58Z","lastTransitionTime":"2026-02-20T16:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.108960 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.109020 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.109039 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.109066 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.109083 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:58Z","lastTransitionTime":"2026-02-20T16:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.212559 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.212606 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.212622 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.212643 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.212659 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:58Z","lastTransitionTime":"2026-02-20T16:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.315325 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.315373 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.315384 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.315406 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.315420 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:58Z","lastTransitionTime":"2026-02-20T16:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.417183 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.417267 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.417276 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.417292 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.417301 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:58Z","lastTransitionTime":"2026-02-20T16:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.519934 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.519988 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.520005 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.520029 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.520047 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:58Z","lastTransitionTime":"2026-02-20T16:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.622355 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.622391 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.622400 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.622415 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.622426 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:58Z","lastTransitionTime":"2026-02-20T16:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.725663 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.725695 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.725703 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.725717 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.725726 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:58Z","lastTransitionTime":"2026-02-20T16:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.828146 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.828188 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.828200 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.828216 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.828225 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:58Z","lastTransitionTime":"2026-02-20T16:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.874113 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 10:08:42.514171517 +0000 UTC Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.876547 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:32:58 crc kubenswrapper[4697]: E0220 16:32:58.876654 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.887841 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.931452 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.931490 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.931502 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.931520 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:58 crc kubenswrapper[4697]: I0220 16:32:58.931536 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:58Z","lastTransitionTime":"2026-02-20T16:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.034183 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.034240 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.034255 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.034276 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.034289 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.136926 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.136967 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.136975 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.137009 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.137020 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.239643 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.239671 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.239680 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.239694 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.239703 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.342346 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.342404 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.342422 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.342484 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.342508 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.445491 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.445546 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.445562 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.445586 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.445606 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.548916 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.548993 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.549021 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.549066 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.549087 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.652656 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.652723 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.652746 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.652776 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.652798 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.755670 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.755731 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.755747 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.755773 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.755791 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.858406 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.858463 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.858479 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.858499 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.858512 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.874296 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 10:32:30.120436852 +0000 UTC Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.876611 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:32:59 crc kubenswrapper[4697]: E0220 16:32:59.876728 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.876786 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.876840 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:32:59 crc kubenswrapper[4697]: E0220 16:32:59.876968 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:32:59 crc kubenswrapper[4697]: E0220 16:32:59.877668 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.928387 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.928482 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.928498 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.928515 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.928527 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: E0220 16:32:59.952982 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.958410 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.958512 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.958532 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.958561 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.958579 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: E0220 16:32:59.978039 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.982624 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.982657 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.982666 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.982681 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:32:59 crc kubenswrapper[4697]: I0220 16:32:59.982692 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:32:59Z","lastTransitionTime":"2026-02-20T16:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:32:59 crc kubenswrapper[4697]: E0220 16:32:59.998657 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:32:59Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.006090 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.006138 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.006151 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.006171 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.006184 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:00 crc kubenswrapper[4697]: E0220 16:33:00.027205 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.033083 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.033135 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.033152 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.033171 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.033184 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:00 crc kubenswrapper[4697]: E0220 16:33:00.049288 4697 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T16:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aff1fb75-2d23-4538-af06-66acb56ad245\\\",\\\"systemUUID\\\":\\\"be1cf996-28db-4a15-bd39-36b5992b7e01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:00Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:00 crc kubenswrapper[4697]: E0220 16:33:00.049425 4697 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.051375 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.051406 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.051414 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.051446 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.051458 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.153380 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.153428 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.153467 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.153497 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.153513 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.255500 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.255534 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.255542 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.255556 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.255565 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.358760 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.359108 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.359203 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.359309 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.359396 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.462173 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.462235 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.462256 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.462282 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.462300 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.565929 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.566415 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.566653 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.566820 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.566969 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.670325 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.670387 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.670405 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.670465 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.670496 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.773857 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.773934 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.773954 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.773981 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.773999 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.875076 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 00:35:21.770323227 +0000 UTC Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.876628 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:00 crc kubenswrapper[4697]: E0220 16:33:00.876781 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.877251 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.877283 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.877294 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.877310 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.877321 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.980637 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.980713 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.980739 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.980770 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:00 crc kubenswrapper[4697]: I0220 16:33:00.980791 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:00Z","lastTransitionTime":"2026-02-20T16:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.083309 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.083351 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.083363 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.083382 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.083395 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:01Z","lastTransitionTime":"2026-02-20T16:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.185728 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.185770 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.185782 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.185799 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.185811 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:01Z","lastTransitionTime":"2026-02-20T16:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.288672 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.288739 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.288758 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.288784 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.288803 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:01Z","lastTransitionTime":"2026-02-20T16:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.391617 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.391671 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.391688 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.391716 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.391734 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:01Z","lastTransitionTime":"2026-02-20T16:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.503644 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.503719 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.503739 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.503767 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.503787 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:01Z","lastTransitionTime":"2026-02-20T16:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.607078 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.607226 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.607251 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.607277 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.607295 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:01Z","lastTransitionTime":"2026-02-20T16:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.709716 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.709778 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.709796 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.709825 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.709843 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:01Z","lastTransitionTime":"2026-02-20T16:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.813142 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.813194 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.813211 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.813236 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.813252 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:01Z","lastTransitionTime":"2026-02-20T16:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.876257 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 17:26:44.5486225 +0000 UTC Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.876540 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.876611 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:01 crc kubenswrapper[4697]: E0220 16:33:01.877041 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:01 crc kubenswrapper[4697]: E0220 16:33:01.877155 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.877786 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:01 crc kubenswrapper[4697]: E0220 16:33:01.878042 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.916183 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.916240 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.916260 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.916284 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:01 crc kubenswrapper[4697]: I0220 16:33:01.916301 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:01Z","lastTransitionTime":"2026-02-20T16:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.020269 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.021272 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.021325 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.021351 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.021365 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:02Z","lastTransitionTime":"2026-02-20T16:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.124582 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.124643 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.124660 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.124688 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.124711 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:02Z","lastTransitionTime":"2026-02-20T16:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.227660 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.227726 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.227748 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.227775 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.227793 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:02Z","lastTransitionTime":"2026-02-20T16:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.331214 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.331318 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.331374 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.331403 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.331425 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:02Z","lastTransitionTime":"2026-02-20T16:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.434271 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.434320 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.434337 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.434363 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.434381 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:02Z","lastTransitionTime":"2026-02-20T16:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.537123 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.537183 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.537201 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.537228 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.537247 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:02Z","lastTransitionTime":"2026-02-20T16:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.640228 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.640273 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.640285 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.640303 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.640316 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:02Z","lastTransitionTime":"2026-02-20T16:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.742809 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.742877 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.742897 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.742927 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.742953 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:02Z","lastTransitionTime":"2026-02-20T16:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.846041 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.846100 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.846119 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.846146 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.846163 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:02Z","lastTransitionTime":"2026-02-20T16:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.876811 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 20:25:48.456199784 +0000 UTC Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.877272 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:02 crc kubenswrapper[4697]: E0220 16:33:02.877611 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.895592 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dac7433987a9aebf12f8325232d720f3d7cb2c3753d65d08eaaa69a137c8a67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
]},{\\\"containerID\\\":\\\"cri-o://272ef8f2d34d0f86eb1ae27cb2195d5cc5e5981f59b547f6363c47845c469a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.911946 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nvnfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e96c5eb-134a-4c03-9899-8f97a9aba0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95426d8df3a04e0d650e0bbc00412c08695be7db9ebbbd34ea392e2e1f7507d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nvnfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.932290 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lrpxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de5dc4e-ef42-48fc-be23-eaec2039c031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:43Z\\\",\\\"message\\\":\\\"2026-02-20T16:31:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b\\\\n2026-02-20T16:31:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_afc547c2-28c6-4f50-8381-613f4c40d47b to /host/opt/cni/bin/\\\\n2026-02-20T16:31:58Z [verbose] multus-daemon started\\\\n2026-02-20T16:31:58Z [verbose] Readiness Indicator file check\\\\n2026-02-20T16:32:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-956hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lrpxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.948239 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.948553 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.948763 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 
16:33:02.948939 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.949094 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:02Z","lastTransitionTime":"2026-02-20T16:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.958011 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99eb233c-7094-4a86-ab37-0b160001bbef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T16:32:55Z\\\",\\\"message\\\":\\\"ift-etcd/etcd-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-vtsdj openshift-network-diagnostics/network-check-source-55646444c4-trplf]\\\\nI0220 16:32:55.891347 6773 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0220 
16:32:55.891378 6773 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jngbq\\\\nI0220 16:32:55.891387 6773 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0220 16:32:55.891391 6773 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7e0903d1380aebd69
3093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5wwd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9zpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.976070 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb182755-c6aa-48db-8be3-2c3e23b4b41b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da06b2005e2900fe48f5d424d6fdc9fe6555a13a4c95959bcd2119a25a70301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1272fc7dd9d6eea71069daf4578b3a34bc7fa
50aa07f4587d79425cffd51a84e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssvmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2x8hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:02 crc kubenswrapper[4697]: I0220 16:33:02.993661 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d76d5561-c9fc-4377-b764-5a6856eada68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9b231279d61bb858c2d662fd41b388d450db0ed9f92b55f968a334a2ce2b50f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1adcc43d1a375bc06218f7d3c94c564132a5f3dd5cde0c7ee1f86883b8100552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b78299504f4a4555d5fa6bd331589ed16effa4428034951de3d8f83ce652780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd4d9b515e7bfbdd3834655c269112653b2bf14cf9c157886bfe05f795d2dbdd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:02Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.012883 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.031480 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.051482 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.051519 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.051527 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.051541 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.051549 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:03Z","lastTransitionTime":"2026-02-20T16:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.054567 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f571603b-6223-4f16-b5fa-019ef7c4abb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bc3584fbb2f3a71e3542587c72a2c3e99fa5a79624df78b4f46a57b99131d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f5357a4479554907b126f1546c0158f14128ad85dde29acf9e99846ff1883a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4939d90b68631a230abe379ed31cbe37cbbaf3d5d74fbf517a3187775aa483c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d18482d64c9671f344248ba3ff258de0f8690017c994e85201effebc66ec7a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01a89aef0bbc3923273b1b409ef1f0899c98e8a70e94c2e2ed72985173d9e7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://246215646f3753d6eed52d6b568b9e72c201490b98728ef01428440df34458d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be41308ab92899ce3699cffca61e012a2e289f42ab4db607071c2a7367206be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-br8r5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vtsdj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.066978 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba970a98-5bee-40d6-ade6-6dcbed87b581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5853bf4a454aeb39b5f32302c7b3b1c8a84f3efccf16eff571b4374520fbe58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q6tc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bgvrc\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.100012 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e092a56d-c256-426f-81ce-ee924c280ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5c5d311d6f3d6d4162cfd700d3aed632800dac45195ca226ebb7e124d747dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e1ba3543676ec079805b8bf480dc8bac06c95c60b55dd940366d926931ca312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4421ef2e154b7dcf3d4124d70fe6c614328e5fa866c963b878328ae865b97e1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
f6383fc89c4de32e4a93ad9293b3d303a73352cb7be92f261f22d87adde61142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f12903aac5d422b30d2baed1141a2e0a628e6cbd1a8d06493e662f2debbbdb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c65c4dc577fcae2b0e62c69d169b55f1c6a2c0b8311d244ef55fd23c1455e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d9fabf6047a37ed51cfd5e8d6cd6efb8221a3f06e2b775063dc4fe6e96623c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c5bffe9f8420e595bd117accc6acad8a3ff283aa2a3f77d4e94208e0a83277f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.123253 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f59f9e2-de79-485a-b5fd-d4d65365f47f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T16:31:51Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 16:31:46.543617 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 16:31:46.546805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705857787/tls.crt::/tmp/serving-cert-3705857787/tls.key\\\\\\\"\\\\nI0220 16:31:51.829339 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 16:31:51.833617 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 16:31:51.833650 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 16:31:51.833692 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 16:31:51.833703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 16:31:51.848235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 16:31:51.848297 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848306 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 16:31:51.848315 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 16:31:51.848322 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 16:31:51.848328 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 16:31:51.848334 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 16:31:51.848559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 16:31:51.849928 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://462040168faeb75475e2d61273f18e885
c71d09b430deb668720afd8d8a67ba5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.153964 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.154020 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.154035 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.154080 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.154096 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:03Z","lastTransitionTime":"2026-02-20T16:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.167801 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jngbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"502233fe-4219-44a5-9ddb-66eae7401369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f635d0ba2b03e67942aa2ef13b8ab4eb78a9cfa5c61d62dd6f03a8c38954a6b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s484d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jngbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.180732 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nskrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aff33f1-a871-41df-a6f1-fd7146e23a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:32:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bg8hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:32:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nskrw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.198145 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"927518f7-a7a4-40f5-b6cf-fea7307727c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f151b7227feeea40703ba55b332bc8deaf90a48f6ffb7209f71263794bcfb8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb5193fa5ed7b44daa85da3cf8199bfae69efa056eaa6c6e3a459254eac4bf6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe2e7c00e970d70138d935fd84bfc819a576485edec6b4f1d5b148ec28aadc1\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.210758 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.222500 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f17d681d82cd6bea59f35c7dc83d730abbdf72f9bff9d642caefa0f2c0db2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.233082 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"971b6d83-6531-43d4-8b71-812317d2748f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05ef3dc6cc8417e0d55e7e4d1bb011b274fbb1bc1e9d314812205b8366d71c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1b2b976d031977f5193be6bb9d09f5648acb271099f04495cfd723e8f6f55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c1b2b976d031977f5193be6bb9d09f5648acb271099f04495cfd723e8f6f55f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T16:31:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T16:31:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T16:31:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.244990 4697 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T16:31:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0c04086a2b33b17bd046d945a1ce0453416ea8389c7b9ac9e5bcb50ca87a38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T16:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T16:33:03Z is after 2025-08-24T17:21:41Z" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.256565 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.256603 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.256612 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.256626 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.256637 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:03Z","lastTransitionTime":"2026-02-20T16:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.359347 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.359479 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.359498 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.359524 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.359541 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:03Z","lastTransitionTime":"2026-02-20T16:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.461097 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.461602 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.461636 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.461659 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.461672 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:03Z","lastTransitionTime":"2026-02-20T16:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.564534 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.564611 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.564630 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.564655 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.564674 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:03Z","lastTransitionTime":"2026-02-20T16:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.667894 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.667953 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.667964 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.667983 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.667995 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:03Z","lastTransitionTime":"2026-02-20T16:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.771552 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.771612 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.771629 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.771659 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.771677 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:03Z","lastTransitionTime":"2026-02-20T16:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.874946 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.875006 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.875026 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.875051 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.875070 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:03Z","lastTransitionTime":"2026-02-20T16:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.876479 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.876517 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.876489 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:03 crc kubenswrapper[4697]: E0220 16:33:03.876699 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:03 crc kubenswrapper[4697]: E0220 16:33:03.876825 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:03 crc kubenswrapper[4697]: E0220 16:33:03.876956 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.877017 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 13:58:22.781469488 +0000 UTC Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.978174 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.978210 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.978219 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.978237 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:03 crc kubenswrapper[4697]: I0220 16:33:03.978248 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:03Z","lastTransitionTime":"2026-02-20T16:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.081083 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.081276 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.081287 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.081304 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.081313 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:04Z","lastTransitionTime":"2026-02-20T16:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.184410 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.184508 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.184526 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.184555 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.184572 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:04Z","lastTransitionTime":"2026-02-20T16:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.287363 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.287420 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.287474 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.287505 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.287528 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:04Z","lastTransitionTime":"2026-02-20T16:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.390408 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.390499 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.390516 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.390541 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.390559 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:04Z","lastTransitionTime":"2026-02-20T16:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.493491 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.493547 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.493564 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.493588 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.493605 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:04Z","lastTransitionTime":"2026-02-20T16:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.596889 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.596940 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.596956 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.596979 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.596996 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:04Z","lastTransitionTime":"2026-02-20T16:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.700237 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.700279 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.700290 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.700305 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.700318 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:04Z","lastTransitionTime":"2026-02-20T16:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.803243 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.803356 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.803378 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.803413 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.803474 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:04Z","lastTransitionTime":"2026-02-20T16:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.876712 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:04 crc kubenswrapper[4697]: E0220 16:33:04.876934 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.877176 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 08:38:07.452958031 +0000 UTC Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.906886 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.906965 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.906989 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.907018 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:04 crc kubenswrapper[4697]: I0220 16:33:04.907036 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:04Z","lastTransitionTime":"2026-02-20T16:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.010069 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.010138 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.010171 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.010202 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.010223 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:05Z","lastTransitionTime":"2026-02-20T16:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.113192 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.113270 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.113284 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.113302 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.113317 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:05Z","lastTransitionTime":"2026-02-20T16:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.216851 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.216901 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.216913 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.216932 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.216944 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:05Z","lastTransitionTime":"2026-02-20T16:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.319774 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.319841 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.319858 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.319883 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.319901 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:05Z","lastTransitionTime":"2026-02-20T16:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.422948 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.423017 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.423039 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.423069 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.423089 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:05Z","lastTransitionTime":"2026-02-20T16:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.526072 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.526107 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.526118 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.526136 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.526151 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:05Z","lastTransitionTime":"2026-02-20T16:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.628706 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.628744 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.628754 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.628773 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.628785 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:05Z","lastTransitionTime":"2026-02-20T16:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.731978 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.732045 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.732102 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.732131 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.732150 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:05Z","lastTransitionTime":"2026-02-20T16:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.835033 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.835083 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.835099 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.835124 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.835140 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:05Z","lastTransitionTime":"2026-02-20T16:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.875985 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:05 crc kubenswrapper[4697]: E0220 16:33:05.876264 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.876300 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.876362 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:05 crc kubenswrapper[4697]: E0220 16:33:05.876616 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.877430 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 22:12:17.913026642 +0000 UTC Feb 20 16:33:05 crc kubenswrapper[4697]: E0220 16:33:05.877717 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.937793 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.937866 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.937887 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.937917 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:05 crc kubenswrapper[4697]: I0220 16:33:05.937937 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:05Z","lastTransitionTime":"2026-02-20T16:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.040538 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.040593 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.040609 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.040635 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.040653 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:06Z","lastTransitionTime":"2026-02-20T16:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.144269 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.144357 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.144382 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.144414 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.144462 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:06Z","lastTransitionTime":"2026-02-20T16:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.246920 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.246968 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.246980 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.246999 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.247011 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:06Z","lastTransitionTime":"2026-02-20T16:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.349993 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.350059 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.350076 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.350100 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.350119 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:06Z","lastTransitionTime":"2026-02-20T16:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.452804 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.452867 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.452885 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.453003 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.453025 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:06Z","lastTransitionTime":"2026-02-20T16:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.555971 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.556014 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.556027 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.556046 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.556059 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:06Z","lastTransitionTime":"2026-02-20T16:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.658580 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.658625 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.658641 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.658667 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.658686 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:06Z","lastTransitionTime":"2026-02-20T16:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.761096 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.761126 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.761135 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.761150 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.761159 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:06Z","lastTransitionTime":"2026-02-20T16:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.863770 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.863830 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.863846 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.863870 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.863889 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:06Z","lastTransitionTime":"2026-02-20T16:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.876427 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:06 crc kubenswrapper[4697]: E0220 16:33:06.876596 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.877851 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 22:03:15.469512747 +0000 UTC Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.966913 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.966949 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.966962 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.967006 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:06 crc kubenswrapper[4697]: I0220 16:33:06.967019 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:06Z","lastTransitionTime":"2026-02-20T16:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.070428 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.070522 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.070540 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.070567 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.070583 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:07Z","lastTransitionTime":"2026-02-20T16:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.172860 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.172903 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.172918 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.172941 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.172953 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:07Z","lastTransitionTime":"2026-02-20T16:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.274641 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.274669 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.274678 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.274691 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.274719 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:07Z","lastTransitionTime":"2026-02-20T16:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.376470 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.376503 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.376529 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.376545 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.376555 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:07Z","lastTransitionTime":"2026-02-20T16:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.479301 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.479380 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.479399 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.479426 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.479496 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:07Z","lastTransitionTime":"2026-02-20T16:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.582894 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.582970 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.582993 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.583022 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.583044 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:07Z","lastTransitionTime":"2026-02-20T16:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.685874 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.685926 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.685944 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.685968 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.685985 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:07Z","lastTransitionTime":"2026-02-20T16:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.789106 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.789151 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.789167 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.789191 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.789208 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:07Z","lastTransitionTime":"2026-02-20T16:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.876357 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.876387 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.877025 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:07 crc kubenswrapper[4697]: E0220 16:33:07.877160 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:07 crc kubenswrapper[4697]: E0220 16:33:07.877243 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:07 crc kubenswrapper[4697]: E0220 16:33:07.877346 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.878669 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 18:01:58.255950327 +0000 UTC Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.891681 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.891735 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.891752 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.891772 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.891789 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:07Z","lastTransitionTime":"2026-02-20T16:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.994089 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.994155 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.994177 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.994206 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:07 crc kubenswrapper[4697]: I0220 16:33:07.994228 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:07Z","lastTransitionTime":"2026-02-20T16:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.097521 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.097579 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.097595 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.097619 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.097642 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:08Z","lastTransitionTime":"2026-02-20T16:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.200901 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.200942 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.200952 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.200970 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.200980 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:08Z","lastTransitionTime":"2026-02-20T16:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.303011 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.303087 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.303110 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.303141 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.303167 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:08Z","lastTransitionTime":"2026-02-20T16:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.406327 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.406402 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.406420 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.406469 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.406487 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:08Z","lastTransitionTime":"2026-02-20T16:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.509986 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.510044 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.510061 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.510084 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.510100 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:08Z","lastTransitionTime":"2026-02-20T16:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.613233 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.613312 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.613337 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.613366 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.613387 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:08Z","lastTransitionTime":"2026-02-20T16:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.716547 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.716824 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.716897 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.716984 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.717054 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:08Z","lastTransitionTime":"2026-02-20T16:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.820404 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.820487 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.820505 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.820533 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.820549 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:08Z","lastTransitionTime":"2026-02-20T16:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.876810 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:08 crc kubenswrapper[4697]: E0220 16:33:08.876999 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.878892 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 09:35:07.734143842 +0000 UTC Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.924550 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.924623 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.924642 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.924672 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:08 crc kubenswrapper[4697]: I0220 16:33:08.924691 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:08Z","lastTransitionTime":"2026-02-20T16:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.028152 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.028196 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.028211 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.028231 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.028246 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:09Z","lastTransitionTime":"2026-02-20T16:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.131655 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.132031 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.132137 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.132241 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.132337 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:09Z","lastTransitionTime":"2026-02-20T16:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.234632 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.234662 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.234670 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.234687 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.234697 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:09Z","lastTransitionTime":"2026-02-20T16:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.337469 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.337494 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.337503 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.337516 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.337528 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:09Z","lastTransitionTime":"2026-02-20T16:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.440681 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.440783 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.440806 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.440831 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.440850 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:09Z","lastTransitionTime":"2026-02-20T16:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.542851 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.542919 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.542942 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.542971 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.543023 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:09Z","lastTransitionTime":"2026-02-20T16:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.645090 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.645139 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.645151 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.645169 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.645181 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:09Z","lastTransitionTime":"2026-02-20T16:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.748623 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.748689 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.748796 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.748825 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.748847 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:09Z","lastTransitionTime":"2026-02-20T16:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.851042 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.851103 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.851124 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.851157 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.851182 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:09Z","lastTransitionTime":"2026-02-20T16:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.877125 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.877158 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.877196 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:09 crc kubenswrapper[4697]: E0220 16:33:09.877314 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:09 crc kubenswrapper[4697]: E0220 16:33:09.877558 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:09 crc kubenswrapper[4697]: E0220 16:33:09.878205 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.878631 4697 scope.go:117] "RemoveContainer" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:33:09 crc kubenswrapper[4697]: E0220 16:33:09.878881 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.879242 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 06:36:19.060522397 +0000 UTC Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.953744 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.953807 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.953824 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.953848 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:09 crc kubenswrapper[4697]: I0220 16:33:09.953864 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:09Z","lastTransitionTime":"2026-02-20T16:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.056720 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.056790 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.056808 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.056835 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.056853 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:10Z","lastTransitionTime":"2026-02-20T16:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.159342 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.159400 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.159417 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.159465 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.159483 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:10Z","lastTransitionTime":"2026-02-20T16:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.160957 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.161009 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.161021 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.161055 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.161071 4697 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T16:33:10Z","lastTransitionTime":"2026-02-20T16:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.219105 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8"] Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.219725 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.222502 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.222807 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.222870 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.222907 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.244793 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=79.244764692 podStartE2EDuration="1m19.244764692s" podCreationTimestamp="2026-02-20 16:31:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:10.242597275 +0000 UTC m=+98.022642703" watchObservedRunningTime="2026-02-20 16:33:10.244764692 +0000 UTC m=+98.024810140" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.289519 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jngbq" podStartSLOduration=75.28949445 podStartE2EDuration="1m15.28949445s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:10.275704267 +0000 UTC m=+98.055749685" watchObservedRunningTime="2026-02-20 16:33:10.28949445 +0000 UTC 
m=+98.069539878" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.314034 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=12.314012935 podStartE2EDuration="12.314012935s" podCreationTimestamp="2026-02-20 16:32:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:10.31379385 +0000 UTC m=+98.093839278" watchObservedRunningTime="2026-02-20 16:33:10.314012935 +0000 UTC m=+98.094058353" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.366390 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.366427 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.366481 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc 
kubenswrapper[4697]: I0220 16:33:10.366518 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.366546 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.367527 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.367503813 podStartE2EDuration="43.367503813s" podCreationTimestamp="2026-02-20 16:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:10.352591801 +0000 UTC m=+98.132637199" watchObservedRunningTime="2026-02-20 16:33:10.367503813 +0000 UTC m=+98.147549221" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.398963 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nvnfb" podStartSLOduration=75.398940741 podStartE2EDuration="1m15.398940741s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:10.398729465 +0000 UTC m=+98.178774883" watchObservedRunningTime="2026-02-20 16:33:10.398940741 +0000 
UTC m=+98.178986149" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.411167 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lrpxf" podStartSLOduration=75.411144772 podStartE2EDuration="1m15.411144772s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:10.410946337 +0000 UTC m=+98.190991775" watchObservedRunningTime="2026-02-20 16:33:10.411144772 +0000 UTC m=+98.191190180" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.453520 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2x8hx" podStartSLOduration=74.453496467 podStartE2EDuration="1m14.453496467s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:10.452594083 +0000 UTC m=+98.232639511" watchObservedRunningTime="2026-02-20 16:33:10.453496467 +0000 UTC m=+98.233541895" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.467783 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.467830 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.467866 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.467916 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.468026 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.468016 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.468127 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.468697 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.476011 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=77.4759984 podStartE2EDuration="1m17.4759984s" podCreationTimestamp="2026-02-20 16:31:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:10.475330052 +0000 UTC m=+98.255375450" watchObservedRunningTime="2026-02-20 16:33:10.4759984 +0000 UTC m=+98.256044018" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.477314 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.488549 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96d97fad-1d2a-468e-8020-e90f4cbc8ba0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xtnn8\" (UID: \"96d97fad-1d2a-468e-8020-e90f4cbc8ba0\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.514952 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.514931825 podStartE2EDuration="1m18.514931825s" podCreationTimestamp="2026-02-20 16:31:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:10.499342784 +0000 UTC m=+98.279388192" watchObservedRunningTime="2026-02-20 16:33:10.514931825 +0000 UTC m=+98.294977233" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.530874 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vtsdj" podStartSLOduration=75.530854584 podStartE2EDuration="1m15.530854584s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:10.530240528 +0000 UTC m=+98.310285946" watchObservedRunningTime="2026-02-20 16:33:10.530854584 +0000 UTC m=+98.310899992" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.542512 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podStartSLOduration=75.54249816 podStartE2EDuration="1m15.54249816s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:10.541521285 +0000 UTC m=+98.321566703" watchObservedRunningTime="2026-02-20 16:33:10.54249816 +0000 UTC m=+98.322543568" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.553135 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.876930 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:10 crc kubenswrapper[4697]: E0220 16:33:10.877302 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.879419 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 14:21:41.248502078 +0000 UTC Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.879492 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 20 16:33:10 crc kubenswrapper[4697]: I0220 16:33:10.888072 4697 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 20 16:33:11 crc kubenswrapper[4697]: I0220 16:33:11.476495 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" event={"ID":"96d97fad-1d2a-468e-8020-e90f4cbc8ba0","Type":"ContainerStarted","Data":"e00da27992c601eda620074406c8456a0993b508a047700019a51c857b546ec9"} Feb 20 16:33:11 crc kubenswrapper[4697]: I0220 16:33:11.476566 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" 
event={"ID":"96d97fad-1d2a-468e-8020-e90f4cbc8ba0","Type":"ContainerStarted","Data":"7d76915bf3d067ad952dd521d88ecb866865891d36730629807b0bbd67d7d68b"} Feb 20 16:33:11 crc kubenswrapper[4697]: I0220 16:33:11.876905 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:11 crc kubenswrapper[4697]: I0220 16:33:11.876950 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:11 crc kubenswrapper[4697]: I0220 16:33:11.877013 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:11 crc kubenswrapper[4697]: E0220 16:33:11.877182 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:11 crc kubenswrapper[4697]: E0220 16:33:11.877242 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:11 crc kubenswrapper[4697]: E0220 16:33:11.877297 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:12 crc kubenswrapper[4697]: I0220 16:33:12.876554 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:12 crc kubenswrapper[4697]: E0220 16:33:12.878834 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:13 crc kubenswrapper[4697]: I0220 16:33:13.876715 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:13 crc kubenswrapper[4697]: I0220 16:33:13.876761 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:13 crc kubenswrapper[4697]: I0220 16:33:13.876708 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:13 crc kubenswrapper[4697]: E0220 16:33:13.876908 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:13 crc kubenswrapper[4697]: E0220 16:33:13.876980 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:13 crc kubenswrapper[4697]: E0220 16:33:13.877138 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:14 crc kubenswrapper[4697]: I0220 16:33:14.408001 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:14 crc kubenswrapper[4697]: E0220 16:33:14.408170 4697 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:33:14 crc kubenswrapper[4697]: E0220 16:33:14.408281 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs podName:0aff33f1-a871-41df-a6f1-fd7146e23a9c nodeName:}" failed. No retries permitted until 2026-02-20 16:34:18.408245427 +0000 UTC m=+166.188290855 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs") pod "network-metrics-daemon-nskrw" (UID: "0aff33f1-a871-41df-a6f1-fd7146e23a9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 16:33:14 crc kubenswrapper[4697]: I0220 16:33:14.877157 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:14 crc kubenswrapper[4697]: E0220 16:33:14.877362 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:15 crc kubenswrapper[4697]: I0220 16:33:15.876914 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:15 crc kubenswrapper[4697]: I0220 16:33:15.877086 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:15 crc kubenswrapper[4697]: E0220 16:33:15.877354 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:15 crc kubenswrapper[4697]: E0220 16:33:15.877604 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:15 crc kubenswrapper[4697]: I0220 16:33:15.877651 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:15 crc kubenswrapper[4697]: E0220 16:33:15.877732 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:16 crc kubenswrapper[4697]: I0220 16:33:16.876935 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:16 crc kubenswrapper[4697]: E0220 16:33:16.877413 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:17 crc kubenswrapper[4697]: I0220 16:33:17.876739 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:17 crc kubenswrapper[4697]: E0220 16:33:17.877131 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:17 crc kubenswrapper[4697]: I0220 16:33:17.876788 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:17 crc kubenswrapper[4697]: I0220 16:33:17.876789 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:17 crc kubenswrapper[4697]: E0220 16:33:17.877470 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:17 crc kubenswrapper[4697]: E0220 16:33:17.877720 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:18 crc kubenswrapper[4697]: I0220 16:33:18.877012 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:18 crc kubenswrapper[4697]: E0220 16:33:18.877277 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:19 crc kubenswrapper[4697]: I0220 16:33:19.876742 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:19 crc kubenswrapper[4697]: I0220 16:33:19.876814 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:19 crc kubenswrapper[4697]: I0220 16:33:19.876853 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:19 crc kubenswrapper[4697]: E0220 16:33:19.876980 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:19 crc kubenswrapper[4697]: E0220 16:33:19.877307 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:19 crc kubenswrapper[4697]: E0220 16:33:19.877590 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:20 crc kubenswrapper[4697]: I0220 16:33:20.876707 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:20 crc kubenswrapper[4697]: E0220 16:33:20.877821 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:21 crc kubenswrapper[4697]: I0220 16:33:21.876328 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:21 crc kubenswrapper[4697]: I0220 16:33:21.876410 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:21 crc kubenswrapper[4697]: E0220 16:33:21.876671 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:21 crc kubenswrapper[4697]: I0220 16:33:21.876685 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:21 crc kubenswrapper[4697]: E0220 16:33:21.876868 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:21 crc kubenswrapper[4697]: E0220 16:33:21.876918 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:22 crc kubenswrapper[4697]: I0220 16:33:22.877071 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:22 crc kubenswrapper[4697]: E0220 16:33:22.880420 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:23 crc kubenswrapper[4697]: I0220 16:33:23.876415 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:23 crc kubenswrapper[4697]: I0220 16:33:23.876485 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:23 crc kubenswrapper[4697]: I0220 16:33:23.876496 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:23 crc kubenswrapper[4697]: E0220 16:33:23.876649 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:23 crc kubenswrapper[4697]: E0220 16:33:23.876879 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:23 crc kubenswrapper[4697]: E0220 16:33:23.876984 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:24 crc kubenswrapper[4697]: I0220 16:33:24.877013 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:24 crc kubenswrapper[4697]: E0220 16:33:24.877182 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:24 crc kubenswrapper[4697]: I0220 16:33:24.878254 4697 scope.go:117] "RemoveContainer" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:33:24 crc kubenswrapper[4697]: E0220 16:33:24.878560 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" Feb 20 16:33:25 crc kubenswrapper[4697]: I0220 16:33:25.877072 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:25 crc kubenswrapper[4697]: I0220 16:33:25.877110 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:25 crc kubenswrapper[4697]: I0220 16:33:25.877109 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:25 crc kubenswrapper[4697]: E0220 16:33:25.877320 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:25 crc kubenswrapper[4697]: E0220 16:33:25.877530 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:25 crc kubenswrapper[4697]: E0220 16:33:25.877660 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:26 crc kubenswrapper[4697]: I0220 16:33:26.877012 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:26 crc kubenswrapper[4697]: E0220 16:33:26.877179 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:27 crc kubenswrapper[4697]: I0220 16:33:27.876693 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:27 crc kubenswrapper[4697]: I0220 16:33:27.876772 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:27 crc kubenswrapper[4697]: E0220 16:33:27.876863 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:27 crc kubenswrapper[4697]: E0220 16:33:27.877038 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:27 crc kubenswrapper[4697]: I0220 16:33:27.876701 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:27 crc kubenswrapper[4697]: E0220 16:33:27.877167 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:28 crc kubenswrapper[4697]: I0220 16:33:28.876499 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:28 crc kubenswrapper[4697]: E0220 16:33:28.876712 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:29 crc kubenswrapper[4697]: I0220 16:33:29.876954 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:29 crc kubenswrapper[4697]: I0220 16:33:29.876954 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:29 crc kubenswrapper[4697]: I0220 16:33:29.876988 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:29 crc kubenswrapper[4697]: E0220 16:33:29.877176 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:29 crc kubenswrapper[4697]: E0220 16:33:29.877352 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:29 crc kubenswrapper[4697]: E0220 16:33:29.877386 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:30 crc kubenswrapper[4697]: I0220 16:33:30.540543 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lrpxf_1de5dc4e-ef42-48fc-be23-eaec2039c031/kube-multus/1.log" Feb 20 16:33:30 crc kubenswrapper[4697]: I0220 16:33:30.541236 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lrpxf_1de5dc4e-ef42-48fc-be23-eaec2039c031/kube-multus/0.log" Feb 20 16:33:30 crc kubenswrapper[4697]: I0220 16:33:30.541326 4697 generic.go:334] "Generic (PLEG): container finished" podID="1de5dc4e-ef42-48fc-be23-eaec2039c031" containerID="a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175" exitCode=1 Feb 20 16:33:30 crc kubenswrapper[4697]: I0220 16:33:30.541382 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lrpxf" 
event={"ID":"1de5dc4e-ef42-48fc-be23-eaec2039c031","Type":"ContainerDied","Data":"a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175"} Feb 20 16:33:30 crc kubenswrapper[4697]: I0220 16:33:30.541530 4697 scope.go:117] "RemoveContainer" containerID="d9bd60e701bc55f1c9a6d5fccb3fe4a0da0acfc7da11838b07124d2b64b99280" Feb 20 16:33:30 crc kubenswrapper[4697]: I0220 16:33:30.543111 4697 scope.go:117] "RemoveContainer" containerID="a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175" Feb 20 16:33:30 crc kubenswrapper[4697]: E0220 16:33:30.543609 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-lrpxf_openshift-multus(1de5dc4e-ef42-48fc-be23-eaec2039c031)\"" pod="openshift-multus/multus-lrpxf" podUID="1de5dc4e-ef42-48fc-be23-eaec2039c031" Feb 20 16:33:30 crc kubenswrapper[4697]: I0220 16:33:30.572957 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xtnn8" podStartSLOduration=95.572938223 podStartE2EDuration="1m35.572938223s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:11.494181304 +0000 UTC m=+99.274226752" watchObservedRunningTime="2026-02-20 16:33:30.572938223 +0000 UTC m=+118.352983631" Feb 20 16:33:30 crc kubenswrapper[4697]: I0220 16:33:30.876912 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:30 crc kubenswrapper[4697]: E0220 16:33:30.877084 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:31 crc kubenswrapper[4697]: I0220 16:33:31.548616 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lrpxf_1de5dc4e-ef42-48fc-be23-eaec2039c031/kube-multus/1.log" Feb 20 16:33:31 crc kubenswrapper[4697]: I0220 16:33:31.876547 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:31 crc kubenswrapper[4697]: I0220 16:33:31.876687 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:31 crc kubenswrapper[4697]: E0220 16:33:31.876726 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:31 crc kubenswrapper[4697]: I0220 16:33:31.876554 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:31 crc kubenswrapper[4697]: E0220 16:33:31.876990 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:31 crc kubenswrapper[4697]: E0220 16:33:31.877209 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:32 crc kubenswrapper[4697]: E0220 16:33:32.835245 4697 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 20 16:33:32 crc kubenswrapper[4697]: I0220 16:33:32.876591 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:32 crc kubenswrapper[4697]: E0220 16:33:32.878330 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:33 crc kubenswrapper[4697]: E0220 16:33:33.004396 4697 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 16:33:33 crc kubenswrapper[4697]: I0220 16:33:33.876196 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:33 crc kubenswrapper[4697]: I0220 16:33:33.876295 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:33 crc kubenswrapper[4697]: I0220 16:33:33.876310 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:33 crc kubenswrapper[4697]: E0220 16:33:33.876402 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:33 crc kubenswrapper[4697]: E0220 16:33:33.876561 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:33 crc kubenswrapper[4697]: E0220 16:33:33.876689 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:34 crc kubenswrapper[4697]: I0220 16:33:34.876200 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:34 crc kubenswrapper[4697]: E0220 16:33:34.876392 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:35 crc kubenswrapper[4697]: I0220 16:33:35.876767 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:35 crc kubenswrapper[4697]: I0220 16:33:35.876863 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:35 crc kubenswrapper[4697]: E0220 16:33:35.876947 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:35 crc kubenswrapper[4697]: I0220 16:33:35.876965 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:35 crc kubenswrapper[4697]: E0220 16:33:35.877145 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:35 crc kubenswrapper[4697]: I0220 16:33:35.880580 4697 scope.go:117] "RemoveContainer" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:33:35 crc kubenswrapper[4697]: E0220 16:33:35.880984 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9zpdc_openshift-ovn-kubernetes(99eb233c-7094-4a86-ab37-0b160001bbef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" Feb 20 16:33:35 crc kubenswrapper[4697]: E0220 16:33:35.881571 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:36 crc kubenswrapper[4697]: I0220 16:33:36.876481 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:36 crc kubenswrapper[4697]: E0220 16:33:36.876677 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:37 crc kubenswrapper[4697]: I0220 16:33:37.876472 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:37 crc kubenswrapper[4697]: I0220 16:33:37.876525 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:37 crc kubenswrapper[4697]: E0220 16:33:37.876604 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:37 crc kubenswrapper[4697]: I0220 16:33:37.876524 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:37 crc kubenswrapper[4697]: E0220 16:33:37.876717 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:37 crc kubenswrapper[4697]: E0220 16:33:37.876861 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:38 crc kubenswrapper[4697]: E0220 16:33:38.006183 4697 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 16:33:38 crc kubenswrapper[4697]: I0220 16:33:38.876861 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:38 crc kubenswrapper[4697]: E0220 16:33:38.877066 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:39 crc kubenswrapper[4697]: I0220 16:33:39.876550 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:39 crc kubenswrapper[4697]: I0220 16:33:39.876612 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:39 crc kubenswrapper[4697]: E0220 16:33:39.876679 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:39 crc kubenswrapper[4697]: I0220 16:33:39.876614 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:39 crc kubenswrapper[4697]: E0220 16:33:39.876735 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:39 crc kubenswrapper[4697]: E0220 16:33:39.876866 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:40 crc kubenswrapper[4697]: I0220 16:33:40.876225 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:40 crc kubenswrapper[4697]: E0220 16:33:40.876494 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:41 crc kubenswrapper[4697]: I0220 16:33:41.876863 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:41 crc kubenswrapper[4697]: I0220 16:33:41.876863 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:41 crc kubenswrapper[4697]: I0220 16:33:41.877603 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:41 crc kubenswrapper[4697]: E0220 16:33:41.877752 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:41 crc kubenswrapper[4697]: E0220 16:33:41.877879 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:41 crc kubenswrapper[4697]: E0220 16:33:41.878077 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:42 crc kubenswrapper[4697]: I0220 16:33:42.876714 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:42 crc kubenswrapper[4697]: E0220 16:33:42.878287 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:42 crc kubenswrapper[4697]: I0220 16:33:42.878344 4697 scope.go:117] "RemoveContainer" containerID="a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175" Feb 20 16:33:43 crc kubenswrapper[4697]: E0220 16:33:43.007609 4697 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 20 16:33:43 crc kubenswrapper[4697]: I0220 16:33:43.593070 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lrpxf_1de5dc4e-ef42-48fc-be23-eaec2039c031/kube-multus/1.log" Feb 20 16:33:43 crc kubenswrapper[4697]: I0220 16:33:43.593718 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lrpxf" event={"ID":"1de5dc4e-ef42-48fc-be23-eaec2039c031","Type":"ContainerStarted","Data":"3c189a1fdd8a35950990c7aaff7044115c85864154e3618a32b8b7eaf68d188d"} Feb 20 16:33:43 crc kubenswrapper[4697]: I0220 16:33:43.876676 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:43 crc kubenswrapper[4697]: I0220 16:33:43.876736 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:43 crc kubenswrapper[4697]: I0220 16:33:43.876697 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:43 crc kubenswrapper[4697]: E0220 16:33:43.876825 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:43 crc kubenswrapper[4697]: E0220 16:33:43.876960 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:43 crc kubenswrapper[4697]: E0220 16:33:43.877085 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:44 crc kubenswrapper[4697]: I0220 16:33:44.876531 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:44 crc kubenswrapper[4697]: E0220 16:33:44.876720 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:45 crc kubenswrapper[4697]: I0220 16:33:45.876136 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:45 crc kubenswrapper[4697]: I0220 16:33:45.876153 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:45 crc kubenswrapper[4697]: E0220 16:33:45.876378 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:45 crc kubenswrapper[4697]: I0220 16:33:45.876181 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:45 crc kubenswrapper[4697]: E0220 16:33:45.876504 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:45 crc kubenswrapper[4697]: E0220 16:33:45.876772 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:46 crc kubenswrapper[4697]: I0220 16:33:46.877187 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:46 crc kubenswrapper[4697]: E0220 16:33:46.877335 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:47 crc kubenswrapper[4697]: I0220 16:33:47.876165 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:47 crc kubenswrapper[4697]: I0220 16:33:47.876165 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:47 crc kubenswrapper[4697]: E0220 16:33:47.876583 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:47 crc kubenswrapper[4697]: I0220 16:33:47.876200 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:47 crc kubenswrapper[4697]: E0220 16:33:47.876505 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:47 crc kubenswrapper[4697]: E0220 16:33:47.877294 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:48 crc kubenswrapper[4697]: E0220 16:33:48.009531 4697 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 16:33:48 crc kubenswrapper[4697]: I0220 16:33:48.877192 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:48 crc kubenswrapper[4697]: E0220 16:33:48.877479 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:49 crc kubenswrapper[4697]: I0220 16:33:49.876034 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:49 crc kubenswrapper[4697]: I0220 16:33:49.876066 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:49 crc kubenswrapper[4697]: I0220 16:33:49.876118 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:49 crc kubenswrapper[4697]: E0220 16:33:49.876225 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:49 crc kubenswrapper[4697]: E0220 16:33:49.876351 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:49 crc kubenswrapper[4697]: E0220 16:33:49.876487 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:49 crc kubenswrapper[4697]: I0220 16:33:49.877134 4697 scope.go:117] "RemoveContainer" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:33:50 crc kubenswrapper[4697]: I0220 16:33:50.617119 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/3.log" Feb 20 16:33:50 crc kubenswrapper[4697]: I0220 16:33:50.619460 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerStarted","Data":"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f"} Feb 20 16:33:50 crc kubenswrapper[4697]: I0220 16:33:50.619896 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:33:50 crc kubenswrapper[4697]: I0220 16:33:50.646875 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podStartSLOduration=115.646857195 podStartE2EDuration="1m55.646857195s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:33:50.642447064 +0000 UTC m=+138.422492482" watchObservedRunningTime="2026-02-20 16:33:50.646857195 +0000 UTC m=+138.426902603" Feb 20 16:33:50 crc kubenswrapper[4697]: I0220 16:33:50.661346 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nskrw"] Feb 20 16:33:50 crc kubenswrapper[4697]: I0220 16:33:50.661509 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:50 crc kubenswrapper[4697]: E0220 16:33:50.661614 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:50 crc kubenswrapper[4697]: I0220 16:33:50.876317 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:50 crc kubenswrapper[4697]: E0220 16:33:50.876519 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:51 crc kubenswrapper[4697]: I0220 16:33:51.876868 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:51 crc kubenswrapper[4697]: E0220 16:33:51.877329 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:51 crc kubenswrapper[4697]: I0220 16:33:51.876979 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:51 crc kubenswrapper[4697]: I0220 16:33:51.876869 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:51 crc kubenswrapper[4697]: E0220 16:33:51.877414 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:51 crc kubenswrapper[4697]: E0220 16:33:51.877627 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:52 crc kubenswrapper[4697]: I0220 16:33:52.876712 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:52 crc kubenswrapper[4697]: E0220 16:33:52.879762 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:53 crc kubenswrapper[4697]: E0220 16:33:53.010946 4697 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 16:33:53 crc kubenswrapper[4697]: I0220 16:33:53.876184 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:53 crc kubenswrapper[4697]: E0220 16:33:53.876376 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:53 crc kubenswrapper[4697]: I0220 16:33:53.876718 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:53 crc kubenswrapper[4697]: I0220 16:33:53.876925 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:53 crc kubenswrapper[4697]: E0220 16:33:53.877084 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:53 crc kubenswrapper[4697]: E0220 16:33:53.877251 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:54 crc kubenswrapper[4697]: I0220 16:33:54.877036 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:54 crc kubenswrapper[4697]: E0220 16:33:54.877299 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:55 crc kubenswrapper[4697]: I0220 16:33:55.876770 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:55 crc kubenswrapper[4697]: I0220 16:33:55.876824 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:55 crc kubenswrapper[4697]: I0220 16:33:55.876824 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:55 crc kubenswrapper[4697]: E0220 16:33:55.876994 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:55 crc kubenswrapper[4697]: E0220 16:33:55.877152 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:55 crc kubenswrapper[4697]: E0220 16:33:55.877284 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:56 crc kubenswrapper[4697]: I0220 16:33:56.877151 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:56 crc kubenswrapper[4697]: E0220 16:33:56.877472 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 16:33:57 crc kubenswrapper[4697]: I0220 16:33:57.876099 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:57 crc kubenswrapper[4697]: I0220 16:33:57.876176 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:57 crc kubenswrapper[4697]: E0220 16:33:57.876272 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 16:33:57 crc kubenswrapper[4697]: I0220 16:33:57.876376 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:57 crc kubenswrapper[4697]: E0220 16:33:57.876575 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 16:33:57 crc kubenswrapper[4697]: E0220 16:33:57.876741 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nskrw" podUID="0aff33f1-a871-41df-a6f1-fd7146e23a9c" Feb 20 16:33:58 crc kubenswrapper[4697]: I0220 16:33:58.876503 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:58 crc kubenswrapper[4697]: I0220 16:33:58.880046 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 20 16:33:58 crc kubenswrapper[4697]: I0220 16:33:58.886113 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.707543 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:33:59 crc kubenswrapper[4697]: E0220 16:33:59.707708 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:36:01.707676063 +0000 UTC m=+269.487721511 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.707769 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.707846 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:59 crc kubenswrapper[4697]: E0220 16:33:59.707999 4697 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:33:59 crc kubenswrapper[4697]: E0220 16:33:59.708032 4697 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:33:59 crc kubenswrapper[4697]: E0220 16:33:59.708102 4697 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:36:01.708074089 +0000 UTC m=+269.488119527 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 16:33:59 crc kubenswrapper[4697]: E0220 16:33:59.708141 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 16:36:01.70812268 +0000 UTC m=+269.488168118 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.809104 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.809226 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.818739 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.818960 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.876797 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.876869 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.876817 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.880790 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.882729 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.882746 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.882856 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 20 16:33:59 crc kubenswrapper[4697]: I0220 16:33:59.916787 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 16:34:00 crc kubenswrapper[4697]: I0220 16:34:00.101570 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:34:00 crc kubenswrapper[4697]: W0220 16:34:00.247636 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-2c4336e3450cfe1f67abebc3b36914e4d3e1bbf2283efbc1a4a3e6c23979e334 WatchSource:0}: Error finding container 2c4336e3450cfe1f67abebc3b36914e4d3e1bbf2283efbc1a4a3e6c23979e334: Status 404 returned error can't find the container with id 2c4336e3450cfe1f67abebc3b36914e4d3e1bbf2283efbc1a4a3e6c23979e334 Feb 20 16:34:00 crc kubenswrapper[4697]: W0220 16:34:00.353092 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4e35591d564cbf4ab42580ad835b68a742ef54603a3faff9911bad6d0bef1593 WatchSource:0}: Error finding container 4e35591d564cbf4ab42580ad835b68a742ef54603a3faff9911bad6d0bef1593: Status 404 returned error can't find the container with id 4e35591d564cbf4ab42580ad835b68a742ef54603a3faff9911bad6d0bef1593 Feb 20 16:34:00 crc kubenswrapper[4697]: I0220 16:34:00.657997 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2728e9357cc706d61c1d36aa3e6f913fc9f7a7472cbcf8a525155446913b567d"} Feb 20 16:34:00 crc kubenswrapper[4697]: I0220 16:34:00.658379 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2c4336e3450cfe1f67abebc3b36914e4d3e1bbf2283efbc1a4a3e6c23979e334"} Feb 20 16:34:00 crc kubenswrapper[4697]: I0220 16:34:00.659793 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9b8615aa35b707ab67f2ea011570643c845d898e6bc33eba43fb953d06aae87c"} Feb 20 16:34:00 crc kubenswrapper[4697]: I0220 16:34:00.659821 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4e35591d564cbf4ab42580ad835b68a742ef54603a3faff9911bad6d0bef1593"} Feb 20 16:34:00 crc kubenswrapper[4697]: I0220 16:34:00.660009 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.185475 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.185574 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.262979 4697 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.325139 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fp67j"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.326124 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.332785 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.332812 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.333069 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.333279 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.333911 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.333932 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.335594 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-t7pft"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.335959 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-t7pft" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.337156 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.337658 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.338903 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.339901 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.340288 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.340893 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.350983 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qj8gc"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.352038 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qj8gc" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.356886 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m8rl6"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.357771 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.358308 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.358373 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.358654 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.358723 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.358815 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.358897 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.358990 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.359012 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.359290 4697 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.359479 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.358655 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.359636 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.359814 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.360005 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.360530 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.362378 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.365829 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.367011 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f6d4m"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.367478 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.368159 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wsc22"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.368972 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wsc22" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.371101 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cm59t"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.371524 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.373665 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.373833 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.373897 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.373984 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.374425 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-txsqk"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.374607 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 20 
16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.374796 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.374859 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.374940 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.375193 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.375261 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.375301 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.375394 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.375522 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.375564 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.375622 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.375726 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.375895 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.376073 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.376422 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.376888 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.377100 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.377215 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.377939 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.377997 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 20 16:34:01 crc 
kubenswrapper[4697]: I0220 16:34:01.378532 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.378608 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.378610 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.381170 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.384197 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.385034 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.385925 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.387203 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.387910 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.388394 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.388678 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.389674 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s4bgz"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.390176 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mqvww"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.391966 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.392472 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.392713 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.393561 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.395802 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.395985 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.396828 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.397504 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.397738 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.397827 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.397906 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.403111 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.405237 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: 
I0220 16:34:01.405668 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.405916 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.406158 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.406480 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.406675 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.409156 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.409928 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.410212 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.412101 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.412687 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.412975 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 20 16:34:01 
crc kubenswrapper[4697]: I0220 16:34:01.413210 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.413942 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.414240 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.414329 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kwgmg"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.414583 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.414819 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.415529 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.415569 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.415677 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.415784 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.417085 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.417394 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.417528 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.417538 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.417696 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.426066 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.440104 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.440310 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.440530 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.440654 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.440823 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.441017 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.441164 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.441383 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.441533 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.442065 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.442160 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.442368 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.442486 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.442783 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.443842 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444171 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmqkq\" (UniqueName: \"kubernetes.io/projected/5a156d77-af5f-4ff3-be63-279a07ea90f9-kube-api-access-fmqkq\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444203 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-config\") pod \"console-operator-58897d9998-qj8gc\" (UID: \"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " pod="openshift-console-operator/console-operator-58897d9998-qj8gc"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444235 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a156d77-af5f-4ff3-be63-279a07ea90f9-serving-cert\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444253 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/09dc2933-a279-4e68-8587-dffed2d34a72-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hxqpv\" (UID: \"09dc2933-a279-4e68-8587-dffed2d34a72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444275 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444290 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a156d77-af5f-4ff3-be63-279a07ea90f9-audit-dir\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444308 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkc4z\" (UniqueName: \"kubernetes.io/projected/09dc2933-a279-4e68-8587-dffed2d34a72-kube-api-access-lkc4z\") pod \"openshift-config-operator-7777fb866f-hxqpv\" (UID: \"09dc2933-a279-4e68-8587-dffed2d34a72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444324 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444341 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4049af-a7dd-47b0-8dea-da1d662031c5-serving-cert\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444356 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-trusted-ca\") pod \"console-operator-58897d9998-qj8gc\" (UID: \"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " pod="openshift-console-operator/console-operator-58897d9998-qj8gc"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444370 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444386 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a156d77-af5f-4ff3-be63-279a07ea90f9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444408 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09dc2933-a279-4e68-8587-dffed2d34a72-serving-cert\") pod \"openshift-config-operator-7777fb866f-hxqpv\" (UID: \"09dc2933-a279-4e68-8587-dffed2d34a72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444425 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a156d77-af5f-4ff3-be63-279a07ea90f9-encryption-config\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444457 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-serving-cert\") pod \"console-operator-58897d9998-qj8gc\" (UID: \"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " pod="openshift-console-operator/console-operator-58897d9998-qj8gc"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444471 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a156d77-af5f-4ff3-be63-279a07ea90f9-audit-policies\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444489 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a156d77-af5f-4ff3-be63-279a07ea90f9-etcd-client\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444514 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvx84\" (UniqueName: \"kubernetes.io/projected/4a4049af-a7dd-47b0-8dea-da1d662031c5-kube-api-access-lvx84\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444528 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-config\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444546 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444561 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-776tn\" (UniqueName: \"kubernetes.io/projected/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-kube-api-access-776tn\") pod \"console-operator-58897d9998-qj8gc\" (UID: \"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " pod="openshift-console-operator/console-operator-58897d9998-qj8gc"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444577 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-client-ca\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444594 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xnn2\" (UniqueName: \"kubernetes.io/projected/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-kube-api-access-2xnn2\") pod \"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444612 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdcdq\" (UniqueName: \"kubernetes.io/projected/314a7b24-36b9-41de-9ec3-3c229fc43b3d-kube-api-access-jdcdq\") pod \"downloads-7954f5f757-t7pft\" (UID: \"314a7b24-36b9-41de-9ec3-3c229fc43b3d\") " pod="openshift-console/downloads-7954f5f757-t7pft"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.444629 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a156d77-af5f-4ff3-be63-279a07ea90f9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.445179 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.445937 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.446330 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.447112 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.447541 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.447546 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.449370 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.451126 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.451735 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.458966 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.459350 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.459608 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.459663 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.460864 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.462136 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.464282 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.466067 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-rxpbh"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.466465 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-rxpbh"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.468181 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.476471 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6ct9m"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.477090 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.478863 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.479195 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.479636 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.480105 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.482368 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.483449 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.485086 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m6zk6"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.485504 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.487197 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.487932 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.488243 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.488631 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.489099 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.492585 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.493545 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.505180 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.506706 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.506874 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.507387 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.507610 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.509175 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.509306 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.509335 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.511067 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.513026 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t7pft"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.514755 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lztm2"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.516156 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lztm2"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.521578 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.528554 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.531647 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fp67j"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.533265 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g58sg"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.534538 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.534585 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g58sg"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.540502 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.541405 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f6d4m"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.542507 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qj8gc"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.543612 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wsc22"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.544562 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545260 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qm6jb\" (UID: \"ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545294 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5869f1e-9909-4326-a2be-9e363563c3d0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545318 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvx84\" (UniqueName: \"kubernetes.io/projected/4a4049af-a7dd-47b0-8dea-da1d662031c5-kube-api-access-lvx84\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545334 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-config\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545350 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545368 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-776tn\" (UniqueName: \"kubernetes.io/projected/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-kube-api-access-776tn\") pod \"console-operator-58897d9998-qj8gc\" (UID: \"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " pod="openshift-console-operator/console-operator-58897d9998-qj8gc"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545385 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xnn2\" (UniqueName: \"kubernetes.io/projected/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-kube-api-access-2xnn2\") pod \"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545400 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdcdq\" (UniqueName: \"kubernetes.io/projected/314a7b24-36b9-41de-9ec3-3c229fc43b3d-kube-api-access-jdcdq\") pod \"downloads-7954f5f757-t7pft\" (UID: \"314a7b24-36b9-41de-9ec3-3c229fc43b3d\") " pod="openshift-console/downloads-7954f5f757-t7pft"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545418 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-client-ca\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545451 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-config\") pod \"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545469 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a156d77-af5f-4ff3-be63-279a07ea90f9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545587 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-txsqk"]
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545797 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e44b077-c323-48e8-be50-124f0a01d7d1-serving-cert\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545831 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b94e775-c465-4a54-858c-a3f82a6de290-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zp2wm\" (UID: \"1b94e775-c465-4a54-858c-a3f82a6de290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545860 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5869f1e-9909-4326-a2be-9e363563c3d0-trusted-ca\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545884 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqkq\" (UniqueName: \"kubernetes.io/projected/5a156d77-af5f-4ff3-be63-279a07ea90f9-kube-api-access-fmqkq\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545903 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5869f1e-9909-4326-a2be-9e363563c3d0-metrics-tls\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545921 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-config\") pod \"console-operator-58897d9998-qj8gc\" (UID: \"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " pod="openshift-console-operator/console-operator-58897d9998-qj8gc"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545943 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phqrt\" (UniqueName: \"kubernetes.io/projected/3e44b077-c323-48e8-be50-124f0a01d7d1-kube-api-access-phqrt\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.545962 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qm6jb\" (UID: \"ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546002 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/09dc2933-a279-4e68-8587-dffed2d34a72-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hxqpv\" (UID: \"09dc2933-a279-4e68-8587-dffed2d34a72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546019 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a156d77-af5f-4ff3-be63-279a07ea90f9-serving-cert\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546042 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546058 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a156d77-af5f-4ff3-be63-279a07ea90f9-audit-dir\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546076 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-auth-proxy-config\") pod \"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg"
Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546099 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-machine-approver-tls\") pod \"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") "
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546121 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-client-ca\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546145 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkc4z\" (UniqueName: \"kubernetes.io/projected/09dc2933-a279-4e68-8587-dffed2d34a72-kube-api-access-lkc4z\") pod \"openshift-config-operator-7777fb866f-hxqpv\" (UID: \"09dc2933-a279-4e68-8587-dffed2d34a72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546162 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546178 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4049af-a7dd-47b0-8dea-da1d662031c5-serving-cert\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546201 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-trusted-ca\") pod \"console-operator-58897d9998-qj8gc\" (UID: \"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " pod="openshift-console-operator/console-operator-58897d9998-qj8gc" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546218 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546235 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jr7c\" (UniqueName: \"kubernetes.io/projected/1b94e775-c465-4a54-858c-a3f82a6de290-kube-api-access-9jr7c\") pod \"openshift-apiserver-operator-796bbdcf4f-zp2wm\" (UID: \"1b94e775-c465-4a54-858c-a3f82a6de290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546266 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn65l\" (UniqueName: \"kubernetes.io/projected/a5869f1e-9909-4326-a2be-9e363563c3d0-kube-api-access-wn65l\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546281 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b94e775-c465-4a54-858c-a3f82a6de290-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zp2wm\" (UID: \"1b94e775-c465-4a54-858c-a3f82a6de290\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546305 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a156d77-af5f-4ff3-be63-279a07ea90f9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546335 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09dc2933-a279-4e68-8587-dffed2d34a72-serving-cert\") pod \"openshift-config-operator-7777fb866f-hxqpv\" (UID: \"09dc2933-a279-4e68-8587-dffed2d34a72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546354 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94-config\") pod \"kube-apiserver-operator-766d6c64bb-qm6jb\" (UID: \"ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546371 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-config\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546391 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/5a156d77-af5f-4ff3-be63-279a07ea90f9-encryption-config\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546408 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-serving-cert\") pod \"console-operator-58897d9998-qj8gc\" (UID: \"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " pod="openshift-console-operator/console-operator-58897d9998-qj8gc" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546426 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a156d77-af5f-4ff3-be63-279a07ea90f9-audit-policies\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546461 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2xfj\" (UniqueName: \"kubernetes.io/projected/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-kube-api-access-j2xfj\") pod \"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546482 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a156d77-af5f-4ff3-be63-279a07ea90f9-etcd-client\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546801 4697 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a156d77-af5f-4ff3-be63-279a07ea90f9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.546930 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-config\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.547317 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cm59t"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.547455 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m8rl6"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.547496 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-client-ca\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.547803 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-config\") pod \"console-operator-58897d9998-qj8gc\" (UID: \"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " pod="openshift-console-operator/console-operator-58897d9998-qj8gc" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.548208 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a156d77-af5f-4ff3-be63-279a07ea90f9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.548607 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.548888 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.549167 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.549241 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a156d77-af5f-4ff3-be63-279a07ea90f9-audit-dir\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.549288 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-trusted-ca\") pod \"console-operator-58897d9998-qj8gc\" (UID: \"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " 
pod="openshift-console-operator/console-operator-58897d9998-qj8gc" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.549794 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a156d77-af5f-4ff3-be63-279a07ea90f9-audit-policies\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.550102 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/09dc2933-a279-4e68-8587-dffed2d34a72-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hxqpv\" (UID: \"09dc2933-a279-4e68-8587-dffed2d34a72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.550577 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.552470 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.552755 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a156d77-af5f-4ff3-be63-279a07ea90f9-encryption-config\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.553635 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-serving-cert\") pod \"console-operator-58897d9998-qj8gc\" (UID: 
\"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " pod="openshift-console-operator/console-operator-58897d9998-qj8gc" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.554836 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a156d77-af5f-4ff3-be63-279a07ea90f9-serving-cert\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.555026 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4049af-a7dd-47b0-8dea-da1d662031c5-serving-cert\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.555344 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.557118 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mqvww"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.558258 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.559460 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.562462 4697 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.563725 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.564722 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.566959 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.567607 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kwgmg"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.567989 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a156d77-af5f-4ff3-be63-279a07ea90f9-etcd-client\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.568171 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.569150 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.569792 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s4bgz"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.571296 4697 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-5vp6h"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.572393 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.573269 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.574288 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-g6ltg"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.576550 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.576638 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g6ltg" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.578484 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6ct9m"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.578541 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.581609 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09dc2933-a279-4e68-8587-dffed2d34a72-serving-cert\") pod \"openshift-config-operator-7777fb866f-hxqpv\" (UID: \"09dc2933-a279-4e68-8587-dffed2d34a72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.581665 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.581681 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.581927 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.584516 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m6zk6"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.585190 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5vp6h"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.586121 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.588251 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.588596 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lztm2"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.588635 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.589836 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.590789 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g6ltg"] Feb 20 16:34:01 crc 
kubenswrapper[4697]: I0220 16:34:01.591868 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g58sg"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.592779 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-wljw5"] Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.593279 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wljw5" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.617066 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.642927 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647109 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qm6jb\" (UID: \"ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647144 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5869f1e-9909-4326-a2be-9e363563c3d0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647186 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-config\") pod 
\"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647219 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e44b077-c323-48e8-be50-124f0a01d7d1-serving-cert\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647234 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b94e775-c465-4a54-858c-a3f82a6de290-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zp2wm\" (UID: \"1b94e775-c465-4a54-858c-a3f82a6de290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647259 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5869f1e-9909-4326-a2be-9e363563c3d0-trusted-ca\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647281 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5869f1e-9909-4326-a2be-9e363563c3d0-metrics-tls\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647302 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-phqrt\" (UniqueName: \"kubernetes.io/projected/3e44b077-c323-48e8-be50-124f0a01d7d1-kube-api-access-phqrt\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647324 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qm6jb\" (UID: \"ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647353 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-auth-proxy-config\") pod \"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647372 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-machine-approver-tls\") pod \"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647389 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-client-ca\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647424 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jr7c\" (UniqueName: \"kubernetes.io/projected/1b94e775-c465-4a54-858c-a3f82a6de290-kube-api-access-9jr7c\") pod \"openshift-apiserver-operator-796bbdcf4f-zp2wm\" (UID: \"1b94e775-c465-4a54-858c-a3f82a6de290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647477 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn65l\" (UniqueName: \"kubernetes.io/projected/a5869f1e-9909-4326-a2be-9e363563c3d0-kube-api-access-wn65l\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647492 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b94e775-c465-4a54-858c-a3f82a6de290-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zp2wm\" (UID: \"1b94e775-c465-4a54-858c-a3f82a6de290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647515 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94-config\") pod \"kube-apiserver-operator-766d6c64bb-qm6jb\" (UID: \"ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647531 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-config\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.647551 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2xfj\" (UniqueName: \"kubernetes.io/projected/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-kube-api-access-j2xfj\") pod \"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.650717 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94-config\") pod \"kube-apiserver-operator-766d6c64bb-qm6jb\" (UID: \"ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.650810 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b94e775-c465-4a54-858c-a3f82a6de290-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zp2wm\" (UID: \"1b94e775-c465-4a54-858c-a3f82a6de290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.650993 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5869f1e-9909-4326-a2be-9e363563c3d0-trusted-ca\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.651142 
4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.651889 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5869f1e-9909-4326-a2be-9e363563c3d0-metrics-tls\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.652999 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-client-ca\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.653146 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b94e775-c465-4a54-858c-a3f82a6de290-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zp2wm\" (UID: \"1b94e775-c465-4a54-858c-a3f82a6de290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.653353 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-config\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.660937 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qm6jb\" (UID: \"ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.670052 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.672196 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e44b077-c323-48e8-be50-124f0a01d7d1-serving-cert\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.689101 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.709493 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.729457 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.748735 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.768955 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.788879 4697 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.809161 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.828581 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.849398 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.869113 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.889493 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.909047 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.929651 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.969486 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 20 16:34:01 crc kubenswrapper[4697]: I0220 16:34:01.989007 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 
16:34:02.009526 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.015048 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-machine-approver-tls\") pod \"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.033181 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.049707 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.058948 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-auth-proxy-config\") pod \"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.069303 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.078164 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-config\") pod \"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" Feb 20 16:34:02 crc 
kubenswrapper[4697]: I0220 16:34:02.109608 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.129831 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.150716 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.169667 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.189569 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.208668 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.229536 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.249750 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.270381 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.289103 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.309058 4697 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.329811 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.349713 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.369549 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.389617 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.408895 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.429768 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.449724 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.469660 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.486872 4697 request.go:700] Waited for 1.00322216s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0 Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.488901 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.510107 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.530063 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.550338 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.568892 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.600516 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.610299 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.629033 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.650110 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.668990 4697 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.689585 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.708609 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.729322 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.748205 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.769607 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.788745 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.809408 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.828995 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.848545 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 
16:34:02.868322 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.895118 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.908733 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.929278 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.949942 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.969946 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 20 16:34:02 crc kubenswrapper[4697]: I0220 16:34:02.988883 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.008372 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.028783 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.049655 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.069175 4697 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"signing-key" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.089836 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.108094 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.130004 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.149948 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.169653 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.203971 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvx84\" (UniqueName: \"kubernetes.io/projected/4a4049af-a7dd-47b0-8dea-da1d662031c5-kube-api-access-lvx84\") pod \"controller-manager-879f6c89f-fp67j\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.235586 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-776tn\" (UniqueName: \"kubernetes.io/projected/b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf-kube-api-access-776tn\") pod \"console-operator-58897d9998-qj8gc\" (UID: \"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf\") " pod="openshift-console-operator/console-operator-58897d9998-qj8gc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.242940 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdcdq\" (UniqueName: 
\"kubernetes.io/projected/314a7b24-36b9-41de-9ec3-3c229fc43b3d-kube-api-access-jdcdq\") pod \"downloads-7954f5f757-t7pft\" (UID: \"314a7b24-36b9-41de-9ec3-3c229fc43b3d\") " pod="openshift-console/downloads-7954f5f757-t7pft" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.252646 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qj8gc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.265668 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkc4z\" (UniqueName: \"kubernetes.io/projected/09dc2933-a279-4e68-8587-dffed2d34a72-kube-api-access-lkc4z\") pod \"openshift-config-operator-7777fb866f-hxqpv\" (UID: \"09dc2933-a279-4e68-8587-dffed2d34a72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.289834 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xnn2\" (UniqueName: \"kubernetes.io/projected/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-kube-api-access-2xnn2\") pod \"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.303413 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmqkq\" (UniqueName: \"kubernetes.io/projected/5a156d77-af5f-4ff3-be63-279a07ea90f9-kube-api-access-fmqkq\") pod \"apiserver-7bbb656c7d-nct46\" (UID: \"5a156d77-af5f-4ff3-be63-279a07ea90f9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.321801 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf879bc2-82c1-4794-b3e6-5c7f3a238b47-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-jgvqx\" (UID: \"bf879bc2-82c1-4794-b3e6-5c7f3a238b47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.328735 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.352030 4697 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.369340 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.392148 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.408596 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.429243 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.449161 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.455553 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.462140 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qj8gc"] Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.468444 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.475475 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-t7pft" Feb 20 16:34:03 crc kubenswrapper[4697]: W0220 16:34:03.475969 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb46bb40b_85f2_458c_9bd2_7ebbb3d58bdf.slice/crio-a7dda71c05189dcb28011609732a41c9cac9bb2873c6dbecaf5510385926f672 WatchSource:0}: Error finding container a7dda71c05189dcb28011609732a41c9cac9bb2873c6dbecaf5510385926f672: Status 404 returned error can't find the container with id a7dda71c05189dcb28011609732a41c9cac9bb2873c6dbecaf5510385926f672 Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.483988 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.487137 4697 request.go:700] Waited for 1.89359526s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dnode-bootstrapper-token&limit=500&resourceVersion=0 Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.489299 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.494561 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.509185 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.516878 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.558200 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qm6jb\" (UID: \"ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.573172 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5869f1e-9909-4326-a2be-9e363563c3d0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.591452 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2xfj\" (UniqueName: \"kubernetes.io/projected/9c9b32a5-4d7e-423d-bacb-38148c5a0e38-kube-api-access-j2xfj\") pod \"machine-approver-56656f9798-5ctcg\" (UID: \"9c9b32a5-4d7e-423d-bacb-38148c5a0e38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.608220 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phqrt\" (UniqueName: \"kubernetes.io/projected/3e44b077-c323-48e8-be50-124f0a01d7d1-kube-api-access-phqrt\") pod \"route-controller-manager-6576b87f9c-qvhc2\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.630670 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.632297 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jr7c\" (UniqueName: \"kubernetes.io/projected/1b94e775-c465-4a54-858c-a3f82a6de290-kube-api-access-9jr7c\") pod \"openshift-apiserver-operator-796bbdcf4f-zp2wm\" (UID: \"1b94e775-c465-4a54-858c-a3f82a6de290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.645954 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.657131 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn65l\" (UniqueName: \"kubernetes.io/projected/a5869f1e-9909-4326-a2be-9e363563c3d0-kube-api-access-wn65l\") pod \"ingress-operator-5b745b69d9-5dstl\" (UID: \"a5869f1e-9909-4326-a2be-9e363563c3d0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.673274 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qj8gc" event={"ID":"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf","Type":"ContainerStarted","Data":"c5ad734c2c9feb5ad139883b1f9d19dee1772fe2e7f7afc83248bae209bd1f3c"} Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.673311 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qj8gc" event={"ID":"b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf","Type":"ContainerStarted","Data":"a7dda71c05189dcb28011609732a41c9cac9bb2873c6dbecaf5510385926f672"} Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.674164 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-console-operator/console-operator-58897d9998-qj8gc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.674449 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.674486 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4795bb15-2193-45c4-80fe-fcb0e99580ca-serving-cert\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.674504 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.674520 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-encryption-config\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.674558 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-tls\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.674575 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb22db1-79e1-4c5f-8853-ac893e454485-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qk7vb\" (UID: \"fdb22db1-79e1-4c5f-8853-ac893e454485\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.674594 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.674918 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-audit-dir\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.674939 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: \"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.675098 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4795bb15-2193-45c4-80fe-fcb0e99580ca-config\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.675726 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-serving-cert\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676342 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b089c24-376b-4007-a8f0-7ead066569db-metrics-tls\") pod \"dns-operator-744455d44c-wsc22\" (UID: \"0b089c24-376b-4007-a8f0-7ead066569db\") " pod="openshift-dns-operator/dns-operator-744455d44c-wsc22" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676372 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676390 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676407 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/734eb893-5f04-4ae9-b45a-cd1ff030a1e8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-q4k9q\" (UID: \"734eb893-5f04-4ae9-b45a-cd1ff030a1e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676470 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-etcd-client\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676486 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-oauth-config\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676513 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx2v7\" (UniqueName: \"kubernetes.io/projected/04900cf7-d92a-4918-92ee-827f0a68f48c-kube-api-access-xx2v7\") pod \"package-server-manager-789f6589d5-jskqv\" (UID: \"04900cf7-d92a-4918-92ee-827f0a68f48c\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676527 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-trusted-ca-bundle\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676543 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-policies\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676559 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-image-import-ca\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676575 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-serving-cert\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676591 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-images\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: \"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676607 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z42pt\" (UniqueName: \"kubernetes.io/projected/70f9d3b5-82e4-47b2-ba65-88980dc9b401-kube-api-access-z42pt\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676627 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-audit\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676641 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f719af6-047e-435e-9946-6869136164e3-config\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676673 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3f719af6-047e-435e-9946-6869136164e3-etcd-ca\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676687 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-oauth-serving-cert\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676722 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676739 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676756 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba0f29fa-2dd7-4725-8455-ff12c8c4f121-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gr64d\" (UID: \"ba0f29fa-2dd7-4725-8455-ff12c8c4f121\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676770 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-service-ca\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676793 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f719af6-047e-435e-9946-6869136164e3-etcd-client\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676808 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phd44\" (UniqueName: \"kubernetes.io/projected/734eb893-5f04-4ae9-b45a-cd1ff030a1e8-kube-api-access-phd44\") pod \"cluster-samples-operator-665b6dd947-q4k9q\" (UID: \"734eb893-5f04-4ae9-b45a-cd1ff030a1e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676824 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw685\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-kube-api-access-mw685\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676839 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-config\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 
16:34:03.676854 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-dir\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676870 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676884 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676908 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-trusted-ca\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676923 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/04900cf7-d92a-4918-92ee-827f0a68f48c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jskqv\" (UID: \"04900cf7-d92a-4918-92ee-827f0a68f48c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676939 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-node-pullsecrets\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676960 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.676985 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf1acafc-f7fd-49ab-b574-9aed683db705-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677001 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbkcm\" (UniqueName: \"kubernetes.io/projected/4795bb15-2193-45c4-80fe-fcb0e99580ca-kube-api-access-pbkcm\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677016 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677032 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-config\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: \"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677046 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-bound-sa-token\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677064 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4795bb15-2193-45c4-80fe-fcb0e99580ca-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677086 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf1acafc-f7fd-49ab-b574-9aed683db705-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677101 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-certificates\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677128 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677153 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb22db1-79e1-4c5f-8853-ac893e454485-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qk7vb\" (UID: \"fdb22db1-79e1-4c5f-8853-ac893e454485\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677171 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb7bv\" (UniqueName: \"kubernetes.io/projected/0b089c24-376b-4007-a8f0-7ead066569db-kube-api-access-hb7bv\") pod \"dns-operator-744455d44c-wsc22\" (UID: 
\"0b089c24-376b-4007-a8f0-7ead066569db\") " pod="openshift-dns-operator/dns-operator-744455d44c-wsc22" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677187 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba0f29fa-2dd7-4725-8455-ff12c8c4f121-proxy-tls\") pod \"machine-config-controller-84d6567774-gr64d\" (UID: \"ba0f29fa-2dd7-4725-8455-ff12c8c4f121\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677214 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677231 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68w5q\" (UniqueName: \"kubernetes.io/projected/3f719af6-047e-435e-9946-6869136164e3-kube-api-access-68w5q\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677253 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmkf\" (UniqueName: \"kubernetes.io/projected/d919170d-e25f-4a96-9503-edaf4c0c3c51-kube-api-access-fsmkf\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677280 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3f719af6-047e-435e-9946-6869136164e3-etcd-service-ca\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677348 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4795bb15-2193-45c4-80fe-fcb0e99580ca-service-ca-bundle\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677369 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-config\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677392 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72shp\" (UniqueName: \"kubernetes.io/projected/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-kube-api-access-72shp\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677414 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94n97\" (UniqueName: \"kubernetes.io/projected/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-kube-api-access-94n97\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: 
\"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677456 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsczt\" (UniqueName: \"kubernetes.io/projected/fdb22db1-79e1-4c5f-8853-ac893e454485-kube-api-access-jsczt\") pod \"openshift-controller-manager-operator-756b6f6bc6-qk7vb\" (UID: \"fdb22db1-79e1-4c5f-8853-ac893e454485\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677479 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-etcd-serving-ca\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677499 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f719af6-047e-435e-9946-6869136164e3-serving-cert\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677537 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlpbj\" (UniqueName: \"kubernetes.io/projected/ba0f29fa-2dd7-4725-8455-ff12c8c4f121-kube-api-access-xlpbj\") pod \"machine-config-controller-84d6567774-gr64d\" (UID: \"ba0f29fa-2dd7-4725-8455-ff12c8c4f121\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.677625 4697 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" Feb 20 16:34:03 crc kubenswrapper[4697]: E0220 16:34:03.678048 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:04.178001581 +0000 UTC m=+151.958046999 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.681832 4697 patch_prober.go:28] interesting pod/console-operator-58897d9998-qj8gc container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.681902 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qj8gc" podUID="b46bb40b-85f2-458c-9bd2-7ebbb3d58bdf" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.709863 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fp67j"] Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.772837 4697 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.777080 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t7pft"] Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.777852 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778147 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9efcd507-8765-4845-8a65-3df282d78a69-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kz5km\" (UID: \"9efcd507-8765-4845-8a65-3df282d78a69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778175 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsczt\" (UniqueName: \"kubernetes.io/projected/fdb22db1-79e1-4c5f-8853-ac893e454485-kube-api-access-jsczt\") pod \"openshift-controller-manager-operator-756b6f6bc6-qk7vb\" (UID: \"fdb22db1-79e1-4c5f-8853-ac893e454485\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778193 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-service-ca-bundle\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " 
pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778211 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcd3306-4d3d-484a-8aa6-cb90424cf206-config\") pod \"service-ca-operator-777779d784-g9d6x\" (UID: \"2bcd3306-4d3d-484a-8aa6-cb90424cf206\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778250 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dedc6ab-dd63-42b7-a4d8-35f9d867a546-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-d5xj5\" (UID: \"7dedc6ab-dd63-42b7-a4d8-35f9d867a546\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5" Feb 20 16:34:03 crc kubenswrapper[4697]: E0220 16:34:03.778292 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:04.278251183 +0000 UTC m=+152.058296591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778341 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hzwp\" (UniqueName: \"kubernetes.io/projected/7dedc6ab-dd63-42b7-a4d8-35f9d867a546-kube-api-access-4hzwp\") pod \"kube-storage-version-migrator-operator-b67b599dd-d5xj5\" (UID: \"7dedc6ab-dd63-42b7-a4d8-35f9d867a546\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778392 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m6zk6\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778412 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0d62d0c-e4d0-4316-ae51-929ffb7295ac-metrics-tls\") pod \"dns-default-g58sg\" (UID: \"c0d62d0c-e4d0-4316-ae51-929ffb7295ac\") " pod="openshift-dns/dns-default-g58sg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778448 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94blt\" 
(UniqueName: \"kubernetes.io/projected/8912bf5c-045e-4c86-9a09-41c4cab10139-kube-api-access-94blt\") pod \"control-plane-machine-set-operator-78cbb6b69f-dkznt\" (UID: \"8912bf5c-045e-4c86-9a09-41c4cab10139\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778468 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-encryption-config\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778486 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c2tx\" (UniqueName: \"kubernetes.io/projected/95906860-29b3-49a2-b903-37b9a5a808a5-kube-api-access-2c2tx\") pod \"collect-profiles-29526750-77rx8\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778521 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4795bb15-2193-45c4-80fe-fcb0e99580ca-serving-cert\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778541 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-tls\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc 
kubenswrapper[4697]: I0220 16:34:03.778559 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb22db1-79e1-4c5f-8853-ac893e454485-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qk7vb\" (UID: \"fdb22db1-79e1-4c5f-8853-ac893e454485\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778578 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778606 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: \"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778627 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-audit-dir\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778644 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4795bb15-2193-45c4-80fe-fcb0e99580ca-config\") pod \"authentication-operator-69f744f599-cm59t\" (UID: 
\"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778664 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-proxy-tls\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778690 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1dbb357b-a417-4661-b97c-272f1c7a5e0e-tmpfs\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778723 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778744 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778761 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/734eb893-5f04-4ae9-b45a-cd1ff030a1e8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-q4k9q\" (UID: \"734eb893-5f04-4ae9-b45a-cd1ff030a1e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778782 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5g2w\" (UniqueName: \"kubernetes.io/projected/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-kube-api-access-p5g2w\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778797 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-socket-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778825 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-policies\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778844 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-serving-cert\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " 
pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778860 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt962\" (UniqueName: \"kubernetes.io/projected/234add05-4e79-4fd0-8412-1101f36476d0-kube-api-access-zt962\") pod \"olm-operator-6b444d44fb-7k8rk\" (UID: \"234add05-4e79-4fd0-8412-1101f36476d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778877 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-plugins-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778898 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-images\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: \"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778916 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z42pt\" (UniqueName: \"kubernetes.io/projected/70f9d3b5-82e4-47b2-ba65-88980dc9b401-kube-api-access-z42pt\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778932 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6b241e1-8a1e-4e61-b5ee-7b548b6301d9-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-f5mvc\" (UID: \"b6b241e1-8a1e-4e61-b5ee-7b548b6301d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778948 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-audit\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778966 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f719af6-047e-435e-9946-6869136164e3-config\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778980 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-default-certificate\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.778996 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3f719af6-047e-435e-9946-6869136164e3-etcd-ca\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779014 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9efcd507-8765-4845-8a65-3df282d78a69-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kz5km\" (UID: \"9efcd507-8765-4845-8a65-3df282d78a69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779051 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779065 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-service-ca\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779084 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b0966f26-8761-43ef-9d07-e6f0080885ec-signing-key\") pod \"service-ca-9c57cc56f-lztm2\" (UID: \"b0966f26-8761-43ef-9d07-e6f0080885ec\") " pod="openshift-service-ca/service-ca-9c57cc56f-lztm2" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779103 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdm99\" (UniqueName: \"kubernetes.io/projected/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-kube-api-access-zdm99\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779121 
4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1dbb357b-a417-4661-b97c-272f1c7a5e0e-webhook-cert\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779161 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-config\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779188 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-dir\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779205 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779222 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9spzs\" (UniqueName: \"kubernetes.io/projected/c4f55cb5-9ae7-40fe-8bc9-8278f67bd4bf-kube-api-access-9spzs\") pod \"migrator-59844c95c7-8r2q4\" (UID: \"c4f55cb5-9ae7-40fe-8bc9-8278f67bd4bf\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779236 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-mountpoint-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779252 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-csi-data-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779795 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-trusted-ca\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779820 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04900cf7-d92a-4918-92ee-827f0a68f48c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jskqv\" (UID: \"04900cf7-d92a-4918-92ee-827f0a68f48c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779838 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2bcd3306-4d3d-484a-8aa6-cb90424cf206-serving-cert\") pod \"service-ca-operator-777779d784-g9d6x\" (UID: \"2bcd3306-4d3d-484a-8aa6-cb90424cf206\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779854 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-config\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: \"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779871 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b241e1-8a1e-4e61-b5ee-7b548b6301d9-config\") pod \"kube-controller-manager-operator-78b949d7b-f5mvc\" (UID: \"b6b241e1-8a1e-4e61-b5ee-7b548b6301d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779903 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbkcm\" (UniqueName: \"kubernetes.io/projected/4795bb15-2193-45c4-80fe-fcb0e99580ca-kube-api-access-pbkcm\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779924 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779957 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-bound-sa-token\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779975 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4795bb15-2193-45c4-80fe-fcb0e99580ca-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.779995 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/382a81b1-8095-480d-9ead-e697f92dd700-cert\") pod \"ingress-canary-g6ltg\" (UID: \"382a81b1-8095-480d-9ead-e697f92dd700\") " pod="openshift-ingress-canary/ingress-canary-g6ltg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780004 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-images\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: \"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780020 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780087 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb22db1-79e1-4c5f-8853-ac893e454485-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qk7vb\" (UID: \"fdb22db1-79e1-4c5f-8853-ac893e454485\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780120 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-images\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780147 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smxtt\" (UniqueName: \"kubernetes.io/projected/1dbb357b-a417-4661-b97c-272f1c7a5e0e-kube-api-access-smxtt\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780176 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb7bv\" (UniqueName: \"kubernetes.io/projected/0b089c24-376b-4007-a8f0-7ead066569db-kube-api-access-hb7bv\") pod \"dns-operator-744455d44c-wsc22\" (UID: \"0b089c24-376b-4007-a8f0-7ead066569db\") " pod="openshift-dns-operator/dns-operator-744455d44c-wsc22" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780206 4697 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba0f29fa-2dd7-4725-8455-ff12c8c4f121-proxy-tls\") pod \"machine-config-controller-84d6567774-gr64d\" (UID: \"ba0f29fa-2dd7-4725-8455-ff12c8c4f121\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780229 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5f5d4dca-8439-49b5-8996-ff1587bc2230-srv-cert\") pod \"catalog-operator-68c6474976-6lsrp\" (UID: \"5f5d4dca-8439-49b5-8996-ff1587bc2230\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780271 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68w5q\" (UniqueName: \"kubernetes.io/projected/3f719af6-047e-435e-9946-6869136164e3-kube-api-access-68w5q\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: E0220 16:34:03.780282 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:04.280267571 +0000 UTC m=+152.060313209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780307 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f719af6-047e-435e-9946-6869136164e3-config\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780850 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3f719af6-047e-435e-9946-6869136164e3-etcd-ca\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.780947 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb22db1-79e1-4c5f-8853-ac893e454485-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qk7vb\" (UID: \"fdb22db1-79e1-4c5f-8853-ac893e454485\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.782135 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-audit\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " 
pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.782152 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4795bb15-2193-45c4-80fe-fcb0e99580ca-config\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.782930 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-service-ca\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.783321 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-dir\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.790212 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-config\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.790897 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba0f29fa-2dd7-4725-8455-ff12c8c4f121-proxy-tls\") pod \"machine-config-controller-84d6567774-gr64d\" (UID: \"ba0f29fa-2dd7-4725-8455-ff12c8c4f121\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.792356 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4795bb15-2193-45c4-80fe-fcb0e99580ca-serving-cert\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.792534 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-audit-dir\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.796306 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.796919 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-serving-cert\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.808603 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-tls\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: 
\"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.813240 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx"] Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.813734 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.815228 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-policies\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.816251 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-config\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: \"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.817760 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4795bb15-2193-45c4-80fe-fcb0e99580ca-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:03 crc 
kubenswrapper[4697]: I0220 16:34:03.821520 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-trusted-ca\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.821585 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3f719af6-047e-435e-9946-6869136164e3-etcd-service-ca\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.822167 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.824064 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3f719af6-047e-435e-9946-6869136164e3-etcd-service-ca\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.825388 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95906860-29b3-49a2-b903-37b9a5a808a5-config-volume\") pod \"collect-profiles-29526750-77rx8\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.825482 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95906860-29b3-49a2-b903-37b9a5a808a5-secret-volume\") pod \"collect-profiles-29526750-77rx8\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.825523 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-metrics-certs\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.825600 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.825622 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/234add05-4e79-4fd0-8412-1101f36476d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7k8rk\" (UID: \"234add05-4e79-4fd0-8412-1101f36476d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.825717 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-etcd-serving-ca\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.825790 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f719af6-047e-435e-9946-6869136164e3-serving-cert\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.828311 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-etcd-serving-ca\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.828939 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.831230 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlpbj\" (UniqueName: \"kubernetes.io/projected/ba0f29fa-2dd7-4725-8455-ff12c8c4f121-kube-api-access-xlpbj\") pod \"machine-config-controller-84d6567774-gr64d\" (UID: \"ba0f29fa-2dd7-4725-8455-ff12c8c4f121\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.832968 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.833188 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbmjh\" (UniqueName: \"kubernetes.io/projected/47c55b18-d2e2-4b8a-a976-6955bc13425b-kube-api-access-pbmjh\") pod \"multus-admission-controller-857f4d67dd-6ct9m\" (UID: \"47c55b18-d2e2-4b8a-a976-6955bc13425b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.833278 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.833312 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.833350 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-registration-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: 
\"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.833478 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8912bf5c-045e-4c86-9a09-41c4cab10139-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dkznt\" (UID: \"8912bf5c-045e-4c86-9a09-41c4cab10139\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.833503 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdbwd\" (UniqueName: \"kubernetes.io/projected/0ae20fce-02da-42af-92de-7f3d280732df-kube-api-access-vdbwd\") pod \"machine-config-server-wljw5\" (UID: \"0ae20fce-02da-42af-92de-7f3d280732df\") " pod="openshift-machine-config-operator/machine-config-server-wljw5" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.833582 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-serving-cert\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.833606 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vm4g\" (UniqueName: \"kubernetes.io/projected/382a81b1-8095-480d-9ead-e697f92dd700-kube-api-access-4vm4g\") pod \"ingress-canary-g6ltg\" (UID: \"382a81b1-8095-480d-9ead-e697f92dd700\") " pod="openshift-ingress-canary/ingress-canary-g6ltg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.833626 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0ae20fce-02da-42af-92de-7f3d280732df-certs\") pod \"machine-config-server-wljw5\" (UID: \"0ae20fce-02da-42af-92de-7f3d280732df\") " pod="openshift-machine-config-operator/machine-config-server-wljw5" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.833654 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b089c24-376b-4007-a8f0-7ead066569db-metrics-tls\") pod \"dns-operator-744455d44c-wsc22\" (UID: \"0b089c24-376b-4007-a8f0-7ead066569db\") " pod="openshift-dns-operator/dns-operator-744455d44c-wsc22" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.835781 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb22db1-79e1-4c5f-8853-ac893e454485-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qk7vb\" (UID: \"fdb22db1-79e1-4c5f-8853-ac893e454485\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.836111 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f719af6-047e-435e-9946-6869136164e3-serving-cert\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.836722 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04900cf7-d92a-4918-92ee-827f0a68f48c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jskqv\" (UID: \"04900cf7-d92a-4918-92ee-827f0a68f48c\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.836786 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: \"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.836121 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-encryption-config\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.838249 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.840791 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/734eb893-5f04-4ae9-b45a-cd1ff030a1e8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-q4k9q\" (UID: \"734eb893-5f04-4ae9-b45a-cd1ff030a1e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.841117 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58rbg\" (UniqueName: 
\"kubernetes.io/projected/c0d62d0c-e4d0-4316-ae51-929ffb7295ac-kube-api-access-58rbg\") pod \"dns-default-g58sg\" (UID: \"c0d62d0c-e4d0-4316-ae51-929ffb7295ac\") " pod="openshift-dns/dns-default-g58sg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.841156 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-etcd-client\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.841189 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-oauth-config\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.841753 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmc2f\" (UniqueName: \"kubernetes.io/projected/adf825e0-c430-439f-9d7c-55b7582f1b54-kube-api-access-gmc2f\") pod \"marketplace-operator-79b997595-m6zk6\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.841809 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx2v7\" (UniqueName: \"kubernetes.io/projected/04900cf7-d92a-4918-92ee-827f0a68f48c-kube-api-access-xx2v7\") pod \"package-server-manager-789f6589d5-jskqv\" (UID: \"04900cf7-d92a-4918-92ee-827f0a68f48c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.841837 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-trusted-ca-bundle\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.841861 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5f5d4dca-8439-49b5-8996-ff1587bc2230-profile-collector-cert\") pod \"catalog-operator-68c6474976-6lsrp\" (UID: \"5f5d4dca-8439-49b5-8996-ff1587bc2230\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.843367 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-serving-cert\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.845107 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-image-import-ca\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.845549 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbbhd\" (UniqueName: \"kubernetes.io/projected/5f5d4dca-8439-49b5-8996-ff1587bc2230-kube-api-access-hbbhd\") pod \"catalog-operator-68c6474976-6lsrp\" (UID: \"5f5d4dca-8439-49b5-8996-ff1587bc2230\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.845662 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-oauth-serving-cert\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.845707 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/47c55b18-d2e2-4b8a-a976-6955bc13425b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6ct9m\" (UID: \"47c55b18-d2e2-4b8a-a976-6955bc13425b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.845819 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.845855 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba0f29fa-2dd7-4725-8455-ff12c8c4f121-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gr64d\" (UID: \"ba0f29fa-2dd7-4725-8455-ff12c8c4f121\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.845894 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0d62d0c-e4d0-4316-ae51-929ffb7295ac-config-volume\") pod \"dns-default-g58sg\" (UID: \"c0d62d0c-e4d0-4316-ae51-929ffb7295ac\") " pod="openshift-dns/dns-default-g58sg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.846749 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-oauth-serving-cert\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.848094 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-trusted-ca-bundle\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.848637 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f719af6-047e-435e-9946-6869136164e3-etcd-client\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.848737 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phd44\" (UniqueName: \"kubernetes.io/projected/734eb893-5f04-4ae9-b45a-cd1ff030a1e8-kube-api-access-phd44\") pod \"cluster-samples-operator-665b6dd947-q4k9q\" (UID: \"734eb893-5f04-4ae9-b45a-cd1ff030a1e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849101 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mw685\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-kube-api-access-mw685\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849163 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849187 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849251 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-node-pullsecrets\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849290 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b0966f26-8761-43ef-9d07-e6f0080885ec-signing-cabundle\") pod \"service-ca-9c57cc56f-lztm2\" (UID: \"b0966f26-8761-43ef-9d07-e6f0080885ec\") " pod="openshift-service-ca/service-ca-9c57cc56f-lztm2" Feb 20 
16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849313 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf1acafc-f7fd-49ab-b574-9aed683db705-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849380 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0ae20fce-02da-42af-92de-7f3d280732df-node-bootstrap-token\") pod \"machine-config-server-wljw5\" (UID: \"0ae20fce-02da-42af-92de-7f3d280732df\") " pod="openshift-machine-config-operator/machine-config-server-wljw5" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849360 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba0f29fa-2dd7-4725-8455-ff12c8c4f121-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gr64d\" (UID: \"ba0f29fa-2dd7-4725-8455-ff12c8c4f121\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849459 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6b241e1-8a1e-4e61-b5ee-7b548b6301d9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f5mvc\" (UID: \"b6b241e1-8a1e-4e61-b5ee-7b548b6301d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849493 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/234add05-4e79-4fd0-8412-1101f36476d0-srv-cert\") pod \"olm-operator-6b444d44fb-7k8rk\" (UID: \"234add05-4e79-4fd0-8412-1101f36476d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849554 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf1acafc-f7fd-49ab-b574-9aed683db705-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849582 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-certificates\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849607 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m6zk6\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849632 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-stats-auth\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 
16:34:03.849676 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkz58\" (UniqueName: \"kubernetes.io/projected/b0966f26-8761-43ef-9d07-e6f0080885ec-kube-api-access-kkz58\") pod \"service-ca-9c57cc56f-lztm2\" (UID: \"b0966f26-8761-43ef-9d07-e6f0080885ec\") " pod="openshift-service-ca/service-ca-9c57cc56f-lztm2"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849695 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dedc6ab-dd63-42b7-a4d8-35f9d867a546-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-d5xj5\" (UID: \"7dedc6ab-dd63-42b7-a4d8-35f9d867a546\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849737 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc9hx\" (UniqueName: \"kubernetes.io/projected/eda7a295-93d7-4a47-915b-ba8b58bfb325-kube-api-access-vc9hx\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849840 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849903 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsmkf\" (UniqueName: \"kubernetes.io/projected/d919170d-e25f-4a96-9503-edaf4c0c3c51-kube-api-access-fsmkf\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849930 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9efcd507-8765-4845-8a65-3df282d78a69-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kz5km\" (UID: \"9efcd507-8765-4845-8a65-3df282d78a69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.849957 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4795bb15-2193-45c4-80fe-fcb0e99580ca-service-ca-bundle\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.850349 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-config\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.850379 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1dbb357b-a417-4661-b97c-272f1c7a5e0e-apiservice-cert\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.850403 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72shp\" (UniqueName: \"kubernetes.io/projected/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-kube-api-access-72shp\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.850456 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94n97\" (UniqueName: \"kubernetes.io/projected/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-kube-api-access-94n97\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: \"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.850487 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4f7p\" (UniqueName: \"kubernetes.io/projected/2bcd3306-4d3d-484a-8aa6-cb90424cf206-kube-api-access-g4f7p\") pod \"service-ca-operator-777779d784-g9d6x\" (UID: \"2bcd3306-4d3d-484a-8aa6-cb90424cf206\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.862173 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsczt\" (UniqueName: \"kubernetes.io/projected/fdb22db1-79e1-4c5f-8853-ac893e454485-kube-api-access-jsczt\") pod \"openshift-controller-manager-operator-756b6f6bc6-qk7vb\" (UID: \"fdb22db1-79e1-4c5f-8853-ac893e454485\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.862607 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-image-import-ca\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.865739 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv"]
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.866099 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-etcd-client\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.867390 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-certificates\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.868279 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf1acafc-f7fd-49ab-b574-9aed683db705-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.868808 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46"]
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.869001 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b089c24-376b-4007-a8f0-7ead066569db-metrics-tls\") pod \"dns-operator-744455d44c-wsc22\" (UID: \"0b089c24-376b-4007-a8f0-7ead066569db\") " pod="openshift-dns-operator/dns-operator-744455d44c-wsc22"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.869305 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-oauth-config\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.869533 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.870769 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.871068 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.871225 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f719af6-047e-435e-9946-6869136164e3-etcd-client\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.871803 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-config\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.873553 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf1acafc-f7fd-49ab-b574-9aed683db705-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.873956 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4795bb15-2193-45c4-80fe-fcb0e99580ca-service-ca-bundle\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.875794 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z42pt\" (UniqueName: \"kubernetes.io/projected/70f9d3b5-82e4-47b2-ba65-88980dc9b401-kube-api-access-z42pt\") pod \"console-f9d7485db-txsqk\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " pod="openshift-console/console-f9d7485db-txsqk"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.876370 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.877997 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.878338 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb7bv\" (UniqueName: \"kubernetes.io/projected/0b089c24-376b-4007-a8f0-7ead066569db-kube-api-access-hb7bv\") pod \"dns-operator-744455d44c-wsc22\" (UID: \"0b089c24-376b-4007-a8f0-7ead066569db\") " pod="openshift-dns-operator/dns-operator-744455d44c-wsc22"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.861420 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-node-pullsecrets\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.889061 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.906892 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbkcm\" (UniqueName: \"kubernetes.io/projected/4795bb15-2193-45c4-80fe-fcb0e99580ca-kube-api-access-pbkcm\") pod \"authentication-operator-69f744f599-cm59t\" (UID: \"4795bb15-2193-45c4-80fe-fcb0e99580ca\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.920761 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-txsqk"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.924185 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68w5q\" (UniqueName: \"kubernetes.io/projected/3f719af6-047e-435e-9946-6869136164e3-kube-api-access-68w5q\") pod \"etcd-operator-b45778765-kwgmg\" (UID: \"3f719af6-047e-435e-9946-6869136164e3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.938542 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.948214 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-bound-sa-token\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.952350 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.952833 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9spzs\" (UniqueName: \"kubernetes.io/projected/c4f55cb5-9ae7-40fe-8bc9-8278f67bd4bf-kube-api-access-9spzs\") pod \"migrator-59844c95c7-8r2q4\" (UID: \"c4f55cb5-9ae7-40fe-8bc9-8278f67bd4bf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.952863 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-mountpoint-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.952881 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-csi-data-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.952898 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcd3306-4d3d-484a-8aa6-cb90424cf206-serving-cert\") pod \"service-ca-operator-777779d784-g9d6x\" (UID: \"2bcd3306-4d3d-484a-8aa6-cb90424cf206\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.952922 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b241e1-8a1e-4e61-b5ee-7b548b6301d9-config\") pod \"kube-controller-manager-operator-78b949d7b-f5mvc\" (UID: \"b6b241e1-8a1e-4e61-b5ee-7b548b6301d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.952955 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/382a81b1-8095-480d-9ead-e697f92dd700-cert\") pod \"ingress-canary-g6ltg\" (UID: \"382a81b1-8095-480d-9ead-e697f92dd700\") " pod="openshift-ingress-canary/ingress-canary-g6ltg"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.952972 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-images\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.952988 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smxtt\" (UniqueName: \"kubernetes.io/projected/1dbb357b-a417-4661-b97c-272f1c7a5e0e-kube-api-access-smxtt\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953006 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5f5d4dca-8439-49b5-8996-ff1587bc2230-srv-cert\") pod \"catalog-operator-68c6474976-6lsrp\" (UID: \"5f5d4dca-8439-49b5-8996-ff1587bc2230\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953027 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95906860-29b3-49a2-b903-37b9a5a808a5-config-volume\") pod \"collect-profiles-29526750-77rx8\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953042 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95906860-29b3-49a2-b903-37b9a5a808a5-secret-volume\") pod \"collect-profiles-29526750-77rx8\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953059 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-metrics-certs\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953074 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953093 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/234add05-4e79-4fd0-8412-1101f36476d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7k8rk\" (UID: \"234add05-4e79-4fd0-8412-1101f36476d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953118 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbmjh\" (UniqueName: \"kubernetes.io/projected/47c55b18-d2e2-4b8a-a976-6955bc13425b-kube-api-access-pbmjh\") pod \"multus-admission-controller-857f4d67dd-6ct9m\" (UID: \"47c55b18-d2e2-4b8a-a976-6955bc13425b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953137 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-registration-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953155 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8912bf5c-045e-4c86-9a09-41c4cab10139-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dkznt\" (UID: \"8912bf5c-045e-4c86-9a09-41c4cab10139\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953174 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdbwd\" (UniqueName: \"kubernetes.io/projected/0ae20fce-02da-42af-92de-7f3d280732df-kube-api-access-vdbwd\") pod \"machine-config-server-wljw5\" (UID: \"0ae20fce-02da-42af-92de-7f3d280732df\") " pod="openshift-machine-config-operator/machine-config-server-wljw5"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953193 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vm4g\" (UniqueName: \"kubernetes.io/projected/382a81b1-8095-480d-9ead-e697f92dd700-kube-api-access-4vm4g\") pod \"ingress-canary-g6ltg\" (UID: \"382a81b1-8095-480d-9ead-e697f92dd700\") " pod="openshift-ingress-canary/ingress-canary-g6ltg"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953212 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0ae20fce-02da-42af-92de-7f3d280732df-certs\") pod \"machine-config-server-wljw5\" (UID: \"0ae20fce-02da-42af-92de-7f3d280732df\") " pod="openshift-machine-config-operator/machine-config-server-wljw5"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953231 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmc2f\" (UniqueName: \"kubernetes.io/projected/adf825e0-c430-439f-9d7c-55b7582f1b54-kube-api-access-gmc2f\") pod \"marketplace-operator-79b997595-m6zk6\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953249 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58rbg\" (UniqueName: \"kubernetes.io/projected/c0d62d0c-e4d0-4316-ae51-929ffb7295ac-kube-api-access-58rbg\") pod \"dns-default-g58sg\" (UID: \"c0d62d0c-e4d0-4316-ae51-929ffb7295ac\") " pod="openshift-dns/dns-default-g58sg"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953281 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5f5d4dca-8439-49b5-8996-ff1587bc2230-profile-collector-cert\") pod \"catalog-operator-68c6474976-6lsrp\" (UID: \"5f5d4dca-8439-49b5-8996-ff1587bc2230\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953299 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbhd\" (UniqueName: \"kubernetes.io/projected/5f5d4dca-8439-49b5-8996-ff1587bc2230-kube-api-access-hbbhd\") pod \"catalog-operator-68c6474976-6lsrp\" (UID: \"5f5d4dca-8439-49b5-8996-ff1587bc2230\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953316 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/47c55b18-d2e2-4b8a-a976-6955bc13425b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6ct9m\" (UID: \"47c55b18-d2e2-4b8a-a976-6955bc13425b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953331 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0d62d0c-e4d0-4316-ae51-929ffb7295ac-config-volume\") pod \"dns-default-g58sg\" (UID: \"c0d62d0c-e4d0-4316-ae51-929ffb7295ac\") " pod="openshift-dns/dns-default-g58sg"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953358 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b0966f26-8761-43ef-9d07-e6f0080885ec-signing-cabundle\") pod \"service-ca-9c57cc56f-lztm2\" (UID: \"b0966f26-8761-43ef-9d07-e6f0080885ec\") " pod="openshift-service-ca/service-ca-9c57cc56f-lztm2"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953379 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6b241e1-8a1e-4e61-b5ee-7b548b6301d9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f5mvc\" (UID: \"b6b241e1-8a1e-4e61-b5ee-7b548b6301d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953394 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0ae20fce-02da-42af-92de-7f3d280732df-node-bootstrap-token\") pod \"machine-config-server-wljw5\" (UID: \"0ae20fce-02da-42af-92de-7f3d280732df\") " pod="openshift-machine-config-operator/machine-config-server-wljw5"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953410 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m6zk6\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953427 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-stats-auth\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953455 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/234add05-4e79-4fd0-8412-1101f36476d0-srv-cert\") pod \"olm-operator-6b444d44fb-7k8rk\" (UID: \"234add05-4e79-4fd0-8412-1101f36476d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953483 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkz58\" (UniqueName: \"kubernetes.io/projected/b0966f26-8761-43ef-9d07-e6f0080885ec-kube-api-access-kkz58\") pod \"service-ca-9c57cc56f-lztm2\" (UID: \"b0966f26-8761-43ef-9d07-e6f0080885ec\") " pod="openshift-service-ca/service-ca-9c57cc56f-lztm2"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953498 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dedc6ab-dd63-42b7-a4d8-35f9d867a546-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-d5xj5\" (UID: \"7dedc6ab-dd63-42b7-a4d8-35f9d867a546\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953513 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc9hx\" (UniqueName: \"kubernetes.io/projected/eda7a295-93d7-4a47-915b-ba8b58bfb325-kube-api-access-vc9hx\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953538 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1dbb357b-a417-4661-b97c-272f1c7a5e0e-apiservice-cert\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953555 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9efcd507-8765-4845-8a65-3df282d78a69-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kz5km\" (UID: \"9efcd507-8765-4845-8a65-3df282d78a69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953584 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4f7p\" (UniqueName: \"kubernetes.io/projected/2bcd3306-4d3d-484a-8aa6-cb90424cf206-kube-api-access-g4f7p\") pod \"service-ca-operator-777779d784-g9d6x\" (UID: \"2bcd3306-4d3d-484a-8aa6-cb90424cf206\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953601 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-service-ca-bundle\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953618 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9efcd507-8765-4845-8a65-3df282d78a69-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kz5km\" (UID: \"9efcd507-8765-4845-8a65-3df282d78a69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953636 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dedc6ab-dd63-42b7-a4d8-35f9d867a546-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-d5xj5\" (UID: \"7dedc6ab-dd63-42b7-a4d8-35f9d867a546\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953651 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcd3306-4d3d-484a-8aa6-cb90424cf206-config\") pod \"service-ca-operator-777779d784-g9d6x\" (UID: \"2bcd3306-4d3d-484a-8aa6-cb90424cf206\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953668 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m6zk6\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953685 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0d62d0c-e4d0-4316-ae51-929ffb7295ac-metrics-tls\") pod \"dns-default-g58sg\" (UID: \"c0d62d0c-e4d0-4316-ae51-929ffb7295ac\") " pod="openshift-dns/dns-default-g58sg"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953703 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94blt\" (UniqueName: \"kubernetes.io/projected/8912bf5c-045e-4c86-9a09-41c4cab10139-kube-api-access-94blt\") pod \"control-plane-machine-set-operator-78cbb6b69f-dkznt\" (UID: \"8912bf5c-045e-4c86-9a09-41c4cab10139\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953721 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hzwp\" (UniqueName: \"kubernetes.io/projected/7dedc6ab-dd63-42b7-a4d8-35f9d867a546-kube-api-access-4hzwp\") pod \"kube-storage-version-migrator-operator-b67b599dd-d5xj5\" (UID: \"7dedc6ab-dd63-42b7-a4d8-35f9d867a546\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953747 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c2tx\" (UniqueName: \"kubernetes.io/projected/95906860-29b3-49a2-b903-37b9a5a808a5-kube-api-access-2c2tx\") pod \"collect-profiles-29526750-77rx8\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953766 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-proxy-tls\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953785 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1dbb357b-a417-4661-b97c-272f1c7a5e0e-tmpfs\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953804 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5g2w\" (UniqueName: \"kubernetes.io/projected/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-kube-api-access-p5g2w\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953820 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-socket-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953838 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt962\" (UniqueName: \"kubernetes.io/projected/234add05-4e79-4fd0-8412-1101f36476d0-kube-api-access-zt962\") pod \"olm-operator-6b444d44fb-7k8rk\" (UID: \"234add05-4e79-4fd0-8412-1101f36476d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953859 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6b241e1-8a1e-4e61-b5ee-7b548b6301d9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f5mvc\" (UID: \"b6b241e1-8a1e-4e61-b5ee-7b548b6301d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953874 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-plugins-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953890 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-default-certificate\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953907 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9efcd507-8765-4845-8a65-3df282d78a69-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kz5km\" (UID: \"9efcd507-8765-4845-8a65-3df282d78a69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953925 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b0966f26-8761-43ef-9d07-e6f0080885ec-signing-key\") pod \"service-ca-9c57cc56f-lztm2\" (UID: \"b0966f26-8761-43ef-9d07-e6f0080885ec\") " pod="openshift-service-ca/service-ca-9c57cc56f-lztm2"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953943 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdm99\" (UniqueName: \"kubernetes.io/projected/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-kube-api-access-zdm99\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh"
Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.953960 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1dbb357b-a417-4661-b97c-272f1c7a5e0e-webhook-cert\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.958522 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-registration-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.958638 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1dbb357b-a417-4661-b97c-272f1c7a5e0e-tmpfs\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.958772 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-socket-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.959655 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9efcd507-8765-4845-8a65-3df282d78a69-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kz5km\" (UID: \"9efcd507-8765-4845-8a65-3df282d78a69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.959704 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-plugins-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " 
pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.961971 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-service-ca-bundle\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.962009 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1dbb357b-a417-4661-b97c-272f1c7a5e0e-webhook-cert\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.962257 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-mountpoint-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:03 crc kubenswrapper[4697]: E0220 16:34:03.962346 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:04.462317417 +0000 UTC m=+152.242362825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.962764 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9efcd507-8765-4845-8a65-3df282d78a69-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kz5km\" (UID: \"9efcd507-8765-4845-8a65-3df282d78a69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.962826 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0d62d0c-e4d0-4316-ae51-929ffb7295ac-config-volume\") pod \"dns-default-g58sg\" (UID: \"c0d62d0c-e4d0-4316-ae51-929ffb7295ac\") " pod="openshift-dns/dns-default-g58sg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.963279 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcd3306-4d3d-484a-8aa6-cb90424cf206-config\") pod \"service-ca-operator-777779d784-g9d6x\" (UID: \"2bcd3306-4d3d-484a-8aa6-cb90424cf206\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.963493 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dedc6ab-dd63-42b7-a4d8-35f9d867a546-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-d5xj5\" (UID: \"7dedc6ab-dd63-42b7-a4d8-35f9d867a546\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.963583 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b0966f26-8761-43ef-9d07-e6f0080885ec-signing-cabundle\") pod \"service-ca-9c57cc56f-lztm2\" (UID: \"b0966f26-8761-43ef-9d07-e6f0080885ec\") " pod="openshift-service-ca/service-ca-9c57cc56f-lztm2" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.964180 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0d62d0c-e4d0-4316-ae51-929ffb7295ac-metrics-tls\") pod \"dns-default-g58sg\" (UID: \"c0d62d0c-e4d0-4316-ae51-929ffb7295ac\") " pod="openshift-dns/dns-default-g58sg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.964835 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-proxy-tls\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.965136 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/234add05-4e79-4fd0-8412-1101f36476d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7k8rk\" (UID: \"234add05-4e79-4fd0-8412-1101f36476d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.965597 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-default-certificate\") pod \"router-default-5444994796-rxpbh\" (UID: 
\"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.966041 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/eda7a295-93d7-4a47-915b-ba8b58bfb325-csi-data-dir\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.966049 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95906860-29b3-49a2-b903-37b9a5a808a5-config-volume\") pod \"collect-profiles-29526750-77rx8\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.966116 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1dbb357b-a417-4661-b97c-272f1c7a5e0e-apiservice-cert\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.966149 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m6zk6\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.966724 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlpbj\" (UniqueName: \"kubernetes.io/projected/ba0f29fa-2dd7-4725-8455-ff12c8c4f121-kube-api-access-xlpbj\") pod 
\"machine-config-controller-84d6567774-gr64d\" (UID: \"ba0f29fa-2dd7-4725-8455-ff12c8c4f121\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.967112 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b241e1-8a1e-4e61-b5ee-7b548b6301d9-config\") pod \"kube-controller-manager-operator-78b949d7b-f5mvc\" (UID: \"b6b241e1-8a1e-4e61-b5ee-7b548b6301d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.967696 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.968658 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-images\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.969338 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0ae20fce-02da-42af-92de-7f3d280732df-node-bootstrap-token\") pod \"machine-config-server-wljw5\" (UID: \"0ae20fce-02da-42af-92de-7f3d280732df\") " pod="openshift-machine-config-operator/machine-config-server-wljw5" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.970411 4697 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m6zk6\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.971660 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0ae20fce-02da-42af-92de-7f3d280732df-certs\") pod \"machine-config-server-wljw5\" (UID: \"0ae20fce-02da-42af-92de-7f3d280732df\") " pod="openshift-machine-config-operator/machine-config-server-wljw5" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.971919 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/47c55b18-d2e2-4b8a-a976-6955bc13425b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6ct9m\" (UID: \"47c55b18-d2e2-4b8a-a976-6955bc13425b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.973525 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95906860-29b3-49a2-b903-37b9a5a808a5-secret-volume\") pod \"collect-profiles-29526750-77rx8\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.973550 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/234add05-4e79-4fd0-8412-1101f36476d0-srv-cert\") pod \"olm-operator-6b444d44fb-7k8rk\" (UID: \"234add05-4e79-4fd0-8412-1101f36476d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.974136 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/382a81b1-8095-480d-9ead-e697f92dd700-cert\") pod \"ingress-canary-g6ltg\" (UID: \"382a81b1-8095-480d-9ead-e697f92dd700\") " pod="openshift-ingress-canary/ingress-canary-g6ltg" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.975240 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6b241e1-8a1e-4e61-b5ee-7b548b6301d9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f5mvc\" (UID: \"b6b241e1-8a1e-4e61-b5ee-7b548b6301d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.977845 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-metrics-certs\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.979414 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-stats-auth\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.981029 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5f5d4dca-8439-49b5-8996-ff1587bc2230-srv-cert\") pod \"catalog-operator-68c6474976-6lsrp\" (UID: \"5f5d4dca-8439-49b5-8996-ff1587bc2230\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.982200 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dedc6ab-dd63-42b7-a4d8-35f9d867a546-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-d5xj5\" (UID: \"7dedc6ab-dd63-42b7-a4d8-35f9d867a546\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.984241 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b0966f26-8761-43ef-9d07-e6f0080885ec-signing-key\") pod \"service-ca-9c57cc56f-lztm2\" (UID: \"b0966f26-8761-43ef-9d07-e6f0080885ec\") " pod="openshift-service-ca/service-ca-9c57cc56f-lztm2" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.987129 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8912bf5c-045e-4c86-9a09-41c4cab10139-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dkznt\" (UID: \"8912bf5c-045e-4c86-9a09-41c4cab10139\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt" Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.987213 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2"] Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.988164 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb"] Feb 20 16:34:03 crc kubenswrapper[4697]: I0220 16:34:03.991617 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx2v7\" (UniqueName: \"kubernetes.io/projected/04900cf7-d92a-4918-92ee-827f0a68f48c-kube-api-access-xx2v7\") pod \"package-server-manager-789f6589d5-jskqv\" (UID: 
\"04900cf7-d92a-4918-92ee-827f0a68f48c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.001339 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcd3306-4d3d-484a-8aa6-cb90424cf206-serving-cert\") pod \"service-ca-operator-777779d784-g9d6x\" (UID: \"2bcd3306-4d3d-484a-8aa6-cb90424cf206\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.002402 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.008159 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phd44\" (UniqueName: \"kubernetes.io/projected/734eb893-5f04-4ae9-b45a-cd1ff030a1e8-kube-api-access-phd44\") pod \"cluster-samples-operator-665b6dd947-q4k9q\" (UID: \"734eb893-5f04-4ae9-b45a-cd1ff030a1e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.009263 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5f5d4dca-8439-49b5-8996-ff1587bc2230-profile-collector-cert\") pod \"catalog-operator-68c6474976-6lsrp\" (UID: \"5f5d4dca-8439-49b5-8996-ff1587bc2230\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.024950 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw685\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-kube-api-access-mw685\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.029944 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.044097 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.052815 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmkf\" (UniqueName: \"kubernetes.io/projected/d919170d-e25f-4a96-9503-edaf4c0c3c51-kube-api-access-fsmkf\") pod \"oauth-openshift-558db77b4-m8rl6\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.055565 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.056640 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:04.556626358 +0000 UTC m=+152.336671756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.065917 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.081672 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94n97\" (UniqueName: \"kubernetes.io/projected/ca43f4e8-ab22-4573-b5cb-5d58dbf788f1-kube-api-access-94n97\") pod \"machine-api-operator-5694c8668f-f6d4m\" (UID: \"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.105280 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72shp\" (UniqueName: \"kubernetes.io/projected/535cb2ff-6eeb-467d-b67b-ed6c1f3a51be-kube-api-access-72shp\") pod \"apiserver-76f77b778f-mqvww\" (UID: \"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be\") " pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.132802 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94blt\" (UniqueName: \"kubernetes.io/projected/8912bf5c-045e-4c86-9a09-41c4cab10139-kube-api-access-94blt\") pod \"control-plane-machine-set-operator-78cbb6b69f-dkznt\" (UID: \"8912bf5c-045e-4c86-9a09-41c4cab10139\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt" Feb 20 16:34:04 crc 
kubenswrapper[4697]: I0220 16:34:04.133695 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm"] Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.146685 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hzwp\" (UniqueName: \"kubernetes.io/projected/7dedc6ab-dd63-42b7-a4d8-35f9d867a546-kube-api-access-4hzwp\") pod \"kube-storage-version-migrator-operator-b67b599dd-d5xj5\" (UID: \"7dedc6ab-dd63-42b7-a4d8-35f9d867a546\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.159852 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.161089 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:04.661059694 +0000 UTC m=+152.441105102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.162208 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.162727 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:04.662712798 +0000 UTC m=+152.442758206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.163107 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.173795 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-txsqk"] Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.174234 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.175888 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wsc22" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.177648 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.180044 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c2tx\" (UniqueName: \"kubernetes.io/projected/95906860-29b3-49a2-b903-37b9a5a808a5-kube-api-access-2c2tx\") pod \"collect-profiles-29526750-77rx8\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.184931 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbmjh\" (UniqueName: \"kubernetes.io/projected/47c55b18-d2e2-4b8a-a976-6955bc13425b-kube-api-access-pbmjh\") pod \"multus-admission-controller-857f4d67dd-6ct9m\" (UID: \"47c55b18-d2e2-4b8a-a976-6955bc13425b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.194806 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.202932 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.206990 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc9hx\" (UniqueName: \"kubernetes.io/projected/eda7a295-93d7-4a47-915b-ba8b58bfb325-kube-api-access-vc9hx\") pod \"csi-hostpathplugin-5vp6h\" (UID: \"eda7a295-93d7-4a47-915b-ba8b58bfb325\") " pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.267277 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.267340 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.268148 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.268257 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:04.768233686 +0000 UTC m=+152.548279094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.268821 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.269328 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:04.769309018 +0000 UTC m=+152.549354426 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.279404 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6b241e1-8a1e-4e61-b5ee-7b548b6301d9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f5mvc\" (UID: \"b6b241e1-8a1e-4e61-b5ee-7b548b6301d9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.302787 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt962\" (UniqueName: \"kubernetes.io/projected/234add05-4e79-4fd0-8412-1101f36476d0-kube-api-access-zt962\") pod \"olm-operator-6b444d44fb-7k8rk\" (UID: \"234add05-4e79-4fd0-8412-1101f36476d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.304818 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5g2w\" (UniqueName: \"kubernetes.io/projected/1e27abc7-011f-4e2f-bd37-98fc4ebc15dc-kube-api-access-p5g2w\") pod \"machine-config-operator-74547568cd-r4lhc\" (UID: \"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.305453 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4f7p\" (UniqueName: 
\"kubernetes.io/projected/2bcd3306-4d3d-484a-8aa6-cb90424cf206-kube-api-access-g4f7p\") pod \"service-ca-operator-777779d784-g9d6x\" (UID: \"2bcd3306-4d3d-484a-8aa6-cb90424cf206\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.318564 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9efcd507-8765-4845-8a65-3df282d78a69-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kz5km\" (UID: \"9efcd507-8765-4845-8a65-3df282d78a69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.349552 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58rbg\" (UniqueName: \"kubernetes.io/projected/c0d62d0c-e4d0-4316-ae51-929ffb7295ac-kube-api-access-58rbg\") pod \"dns-default-g58sg\" (UID: \"c0d62d0c-e4d0-4316-ae51-929ffb7295ac\") " pod="openshift-dns/dns-default-g58sg" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.358281 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kwgmg"] Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.368002 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkz58\" (UniqueName: \"kubernetes.io/projected/b0966f26-8761-43ef-9d07-e6f0080885ec-kube-api-access-kkz58\") pod \"service-ca-9c57cc56f-lztm2\" (UID: \"b0966f26-8761-43ef-9d07-e6f0080885ec\") " pod="openshift-service-ca/service-ca-9c57cc56f-lztm2" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.371989 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.372357 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:04.872343948 +0000 UTC m=+152.652389356 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.376471 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdm99\" (UniqueName: \"kubernetes.io/projected/5bd2236d-2830-47cd-9eb8-9d9f07c821b6-kube-api-access-zdm99\") pod \"router-default-5444994796-rxpbh\" (UID: \"5bd2236d-2830-47cd-9eb8-9d9f07c821b6\") " pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.390140 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.396461 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbbhd\" (UniqueName: \"kubernetes.io/projected/5f5d4dca-8439-49b5-8996-ff1587bc2230-kube-api-access-hbbhd\") pod \"catalog-operator-68c6474976-6lsrp\" (UID: \"5f5d4dca-8439-49b5-8996-ff1587bc2230\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.400486 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.407669 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.415791 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.425679 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.439976 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl"] Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.443480 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.445004 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9spzs\" (UniqueName: \"kubernetes.io/projected/c4f55cb5-9ae7-40fe-8bc9-8278f67bd4bf-kube-api-access-9spzs\") pod \"migrator-59844c95c7-8r2q4\" (UID: \"c4f55cb5-9ae7-40fe-8bc9-8278f67bd4bf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.447028 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdbwd\" (UniqueName: \"kubernetes.io/projected/0ae20fce-02da-42af-92de-7f3d280732df-kube-api-access-vdbwd\") pod \"machine-config-server-wljw5\" (UID: \"0ae20fce-02da-42af-92de-7f3d280732df\") " pod="openshift-machine-config-operator/machine-config-server-wljw5" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.450192 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vm4g\" (UniqueName: \"kubernetes.io/projected/382a81b1-8095-480d-9ead-e697f92dd700-kube-api-access-4vm4g\") pod \"ingress-canary-g6ltg\" (UID: \"382a81b1-8095-480d-9ead-e697f92dd700\") " pod="openshift-ingress-canary/ingress-canary-g6ltg" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.456281 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.462366 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.463101 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmc2f\" (UniqueName: \"kubernetes.io/projected/adf825e0-c430-439f-9d7c-55b7582f1b54-kube-api-access-gmc2f\") pod \"marketplace-operator-79b997595-m6zk6\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.473958 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.474479 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.475040 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:04.975025425 +0000 UTC m=+152.755070833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.495181 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.503103 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smxtt\" (UniqueName: \"kubernetes.io/projected/1dbb357b-a417-4661-b97c-272f1c7a5e0e-kube-api-access-smxtt\") pod \"packageserver-d55dfcdfc-xmjwk\" (UID: \"1dbb357b-a417-4661-b97c-272f1c7a5e0e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.503427 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lztm2" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.511039 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-g58sg" Feb 20 16:34:04 crc kubenswrapper[4697]: W0220 16:34:04.522486 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bd2236d_2830_47cd_9eb8_9d9f07c821b6.slice/crio-ee3d064126ff4d359e6285b477ee7f328862961db08fa8f15d3839ff69f1d715 WatchSource:0}: Error finding container ee3d064126ff4d359e6285b477ee7f328862961db08fa8f15d3839ff69f1d715: Status 404 returned error can't find the container with id ee3d064126ff4d359e6285b477ee7f328862961db08fa8f15d3839ff69f1d715 Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.561272 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g6ltg" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.567754 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wljw5" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.575212 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.575574 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:05.075552188 +0000 UTC m=+152.855597596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.676493 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv"] Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.677595 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.677881 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:05.17786896 +0000 UTC m=+152.957914368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.681498 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.731829 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.739395 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" event={"ID":"3e44b077-c323-48e8-be50-124f0a01d7d1","Type":"ContainerStarted","Data":"ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.739446 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" event={"ID":"3e44b077-c323-48e8-be50-124f0a01d7d1","Type":"ContainerStarted","Data":"e025121ed28f88c52cc3ed0323107b6147f50f30bcaf2f7564a0a4c47af309d7"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.739679 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.741021 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" 
event={"ID":"ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94","Type":"ContainerStarted","Data":"018cde41739b888147e2eb3e5ab0815f8965a976cc20b50acd54d3811308dccd"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.742370 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txsqk" event={"ID":"70f9d3b5-82e4-47b2-ba65-88980dc9b401","Type":"ContainerStarted","Data":"eaedf32cafb61b90d2b084bed9bbf7097316783584742857d68ecaa704f3077f"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.744579 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rxpbh" event={"ID":"5bd2236d-2830-47cd-9eb8-9d9f07c821b6","Type":"ContainerStarted","Data":"ee3d064126ff4d359e6285b477ee7f328862961db08fa8f15d3839ff69f1d715"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.746971 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t7pft" event={"ID":"314a7b24-36b9-41de-9ec3-3c229fc43b3d","Type":"ContainerStarted","Data":"83d8122b86a1914f9901993fc0e6acd99597d5a777569e791382da56c35cf7fb"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.746996 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t7pft" event={"ID":"314a7b24-36b9-41de-9ec3-3c229fc43b3d","Type":"ContainerStarted","Data":"87b50ec965aca0dd223c492823929de9837bf5309897fb8888193a37b8cc856a"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.747118 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-t7pft" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.748768 4697 patch_prober.go:28] interesting pod/downloads-7954f5f757-t7pft container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 
16:34:04.748809 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t7pft" podUID="314a7b24-36b9-41de-9ec3-3c229fc43b3d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.749007 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.749676 4697 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qvhc2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.749707 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" podUID="3e44b077-c323-48e8-be50-124f0a01d7d1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.769013 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" event={"ID":"1b94e775-c465-4a54-858c-a3f82a6de290","Type":"ContainerStarted","Data":"e7bb6fb8455f0e48b29da9a32f5c7ef211d93c1b8eff862ec662849544be4e0c"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.772958 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" 
event={"ID":"5a156d77-af5f-4ff3-be63-279a07ea90f9","Type":"ContainerStarted","Data":"d9add876c5a9a6da67c855b041a53e5512eaafb36db21ec7f3b870654f231871"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.774063 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx" event={"ID":"bf879bc2-82c1-4794-b3e6-5c7f3a238b47","Type":"ContainerStarted","Data":"fefbd2bfd4dee37225b72701d422683794e222f99e27dc224f9ac50f95902f50"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.774091 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx" event={"ID":"bf879bc2-82c1-4794-b3e6-5c7f3a238b47","Type":"ContainerStarted","Data":"6225ddc3342a76264dfaa0417ad05e30dcefcb36eb8fc938c5a3a11a5257988d"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.775267 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" event={"ID":"4a4049af-a7dd-47b0-8dea-da1d662031c5","Type":"ContainerStarted","Data":"23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.775287 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" event={"ID":"4a4049af-a7dd-47b0-8dea-da1d662031c5","Type":"ContainerStarted","Data":"256c72c8cb5c2b8dbbb0245db2fa23324cf7bc3e4540c382d5aa24b23e7ae736"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.775615 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.776530 4697 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fp67j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.776583 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" podUID="4a4049af-a7dd-47b0-8dea-da1d662031c5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.777175 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" event={"ID":"9c9b32a5-4d7e-423d-bacb-38148c5a0e38","Type":"ContainerStarted","Data":"c4f0756a3820fc5561ebc379092acd5eb9865d6ca24beab9d5fbed687ac40bdb"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.777204 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" event={"ID":"9c9b32a5-4d7e-423d-bacb-38148c5a0e38","Type":"ContainerStarted","Data":"78fd3a6c1536e14e267de044e5d3d67adb23220b02125ae94649bd86b232d864"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.778335 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.778533 4697 generic.go:334] "Generic (PLEG): container finished" podID="09dc2933-a279-4e68-8587-dffed2d34a72" containerID="5ade1e2d6b178bdeffb6ba36f957826e01f1affd14a7253b871b9e8c1fe9fff0" exitCode=0 Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.778613 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" event={"ID":"09dc2933-a279-4e68-8587-dffed2d34a72","Type":"ContainerDied","Data":"5ade1e2d6b178bdeffb6ba36f957826e01f1affd14a7253b871b9e8c1fe9fff0"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.778637 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" event={"ID":"09dc2933-a279-4e68-8587-dffed2d34a72","Type":"ContainerStarted","Data":"e0092283baab714810b887db5b994607c07920406cd37ce33b94c5d6372e36a1"} Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.779114 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:05.279095131 +0000 UTC m=+153.059140539 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.782270 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" event={"ID":"3f719af6-047e-435e-9946-6869136164e3","Type":"ContainerStarted","Data":"6278b2fb03cade0903cc132804ad7466b62b6ba393e5679455b937d79d2a20d2"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.783993 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" 
event={"ID":"a5869f1e-9909-4326-a2be-9e363563c3d0","Type":"ContainerStarted","Data":"0b0f24fa59049314d9fbc02c817e1c6d74684cfc1dc4fb757037802cc0ee1b91"} Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.882872 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.883564 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:05.383552177 +0000 UTC m=+153.163597585 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:04 crc kubenswrapper[4697]: I0220 16:34:04.995121 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:04 crc kubenswrapper[4697]: E0220 16:34:04.995557 4697 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:05.495541946 +0000 UTC m=+153.275587354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.096877 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:05 crc kubenswrapper[4697]: E0220 16:34:05.097236 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:05.597220564 +0000 UTC m=+153.377265972 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.199008 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:05 crc kubenswrapper[4697]: E0220 16:34:05.199515 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:05.699499315 +0000 UTC m=+153.479544723 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.238865 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qj8gc" Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.300680 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:05 crc kubenswrapper[4697]: E0220 16:34:05.304444 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:05.804408459 +0000 UTC m=+153.584453857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.329721 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" podStartSLOduration=129.329699874 podStartE2EDuration="2m9.329699874s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:05.329701974 +0000 UTC m=+153.109747382" watchObservedRunningTime="2026-02-20 16:34:05.329699874 +0000 UTC m=+153.109745282" Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.408122 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:05 crc kubenswrapper[4697]: E0220 16:34:05.411983 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:05.908424638 +0000 UTC m=+153.688470046 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.473558 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f6d4m"] Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.490134 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m8rl6"] Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.513277 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:05 crc kubenswrapper[4697]: E0220 16:34:05.514255 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:06.014242457 +0000 UTC m=+153.794287865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.522068 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" podStartSLOduration=130.522050421 podStartE2EDuration="2m10.522050421s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:05.511508571 +0000 UTC m=+153.291553979" watchObservedRunningTime="2026-02-20 16:34:05.522050421 +0000 UTC m=+153.302095829" Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.525407 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wsc22"] Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.545023 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb"] Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.546345 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d"] Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.583856 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt"] Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.604818 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-cm59t"] Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.613805 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:05 crc kubenswrapper[4697]: E0220 16:34:05.614169 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:06.114153396 +0000 UTC m=+153.894198804 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.621731 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6ct9m"] Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.716801 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:05 crc 
kubenswrapper[4697]: E0220 16:34:05.717092 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:06.217081213 +0000 UTC m=+153.997126621 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.765203 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-t7pft" podStartSLOduration=130.765190925 podStartE2EDuration="2m10.765190925s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:05.763573552 +0000 UTC m=+153.543618950" watchObservedRunningTime="2026-02-20 16:34:05.765190925 +0000 UTC m=+153.545236333" Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.817619 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:05 crc kubenswrapper[4697]: E0220 16:34:05.818052 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:06.318031901 +0000 UTC m=+154.098077309 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.921041 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:05 crc kubenswrapper[4697]: E0220 16:34:05.935803 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:06.435776565 +0000 UTC m=+154.215821963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.944471 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" event={"ID":"9c9b32a5-4d7e-423d-bacb-38148c5a0e38","Type":"ContainerStarted","Data":"4a342943f9ce5640dea5eaf284b6851a7cc3127f5fcde76641828df3d592e48e"} Feb 20 16:34:05 crc kubenswrapper[4697]: W0220 16:34:05.966719 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47c55b18_d2e2_4b8a_a976_6955bc13425b.slice/crio-990704a3ea292e5cd5f9f1dcf8df5d5181150d11c83e083e7ab4e723432d7094 WatchSource:0}: Error finding container 990704a3ea292e5cd5f9f1dcf8df5d5181150d11c83e083e7ab4e723432d7094: Status 404 returned error can't find the container with id 990704a3ea292e5cd5f9f1dcf8df5d5181150d11c83e083e7ab4e723432d7094 Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.987712 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" event={"ID":"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1","Type":"ContainerStarted","Data":"ccab84fd53d43f5e841c24bf6c685fa30e35b5456cf09ae5fdc72aabab1e9254"} Feb 20 16:34:05 crc kubenswrapper[4697]: I0220 16:34:05.987751 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.023027 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:06 crc kubenswrapper[4697]: E0220 16:34:06.023379 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:06.523355334 +0000 UTC m=+154.303400762 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.035601 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" event={"ID":"3f719af6-047e-435e-9946-6869136164e3","Type":"ContainerStarted","Data":"d2190affb186bd0de3a142c8bf5dcbdd4a5eef6e8a16fe849e065ce050a771ca"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.043596 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.052642 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5vp6h"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.081502 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" event={"ID":"1b94e775-c465-4a54-858c-a3f82a6de290","Type":"ContainerStarted","Data":"816d0f04d7b16f3ae5d352616b01f91d01a814269831922c7a5410e601e50db6"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.083770 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" event={"ID":"ba2c3c8e-5fc7-4a41-8b38-41d5c2beaf94","Type":"ContainerStarted","Data":"ef292fba2a51b3e00272d42d76675588e6834cf24fa5caa02e2941809b1a19ca"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.097203 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wsc22" event={"ID":"0b089c24-376b-4007-a8f0-7ead066569db","Type":"ContainerStarted","Data":"e0cf0d9371f6d1c57409956ba75520c8d8ba9ed5759045ef206d33e509c6ff06"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.100224 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mqvww"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.118999 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.123705 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qj8gc" podStartSLOduration=131.123688479 podStartE2EDuration="2m11.123688479s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.10007693 +0000 UTC m=+153.880122338" watchObservedRunningTime="2026-02-20 16:34:06.123688479 +0000 UTC m=+153.903733897" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.125670 4697 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:06 crc kubenswrapper[4697]: E0220 16:34:06.130035 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:06.630020546 +0000 UTC m=+154.410065954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.133722 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.140928 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.152083 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.154978 4697 generic.go:334] "Generic (PLEG): container finished" podID="5a156d77-af5f-4ff3-be63-279a07ea90f9" 
containerID="5f7b5bc317ad8a814fdb6df7a5e40a6120247b26c5af6038491e206f9adda167" exitCode=0 Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.155049 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" event={"ID":"5a156d77-af5f-4ff3-be63-279a07ea90f9","Type":"ContainerDied","Data":"5f7b5bc317ad8a814fdb6df7a5e40a6120247b26c5af6038491e206f9adda167"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.174223 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" event={"ID":"fdb22db1-79e1-4c5f-8853-ac893e454485","Type":"ContainerStarted","Data":"87685eeca8ab198c7922697aa1a4aae990bf5d93d1c2e1157ca4ef2d80fbaa2e"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.192042 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" event={"ID":"a5869f1e-9909-4326-a2be-9e363563c3d0","Type":"ContainerStarted","Data":"2bd209748c305d80007bb289782cabf592eb72c8107acdb05d6f1f715ecb9227"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.192083 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" event={"ID":"a5869f1e-9909-4326-a2be-9e363563c3d0","Type":"ContainerStarted","Data":"251e0a85faeb44bd599ec0e99ea2edb4ae21dbd700ee4de4f20bb20271b2d0b5"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.194540 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jgvqx" podStartSLOduration=130.194530037 podStartE2EDuration="2m10.194530037s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.193209355 +0000 UTC m=+153.973254763" 
watchObservedRunningTime="2026-02-20 16:34:06.194530037 +0000 UTC m=+153.974575445" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.227376 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.227506 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" event={"ID":"09dc2933-a279-4e68-8587-dffed2d34a72","Type":"ContainerStarted","Data":"48d23035b508f66235791a08577737faaa03606fcb38d91323f3aabefee38c1f"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.228011 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" Feb 20 16:34:06 crc kubenswrapper[4697]: W0220 16:34:06.228076 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod535cb2ff_6eeb_467d_b67b_ed6c1f3a51be.slice/crio-6e96c77402323ee4dcf0169beb111614bbca5c1bac77c0a2a95ae74edc14c06d WatchSource:0}: Error finding container 6e96c77402323ee4dcf0169beb111614bbca5c1bac77c0a2a95ae74edc14c06d: Status 404 returned error can't find the container with id 6e96c77402323ee4dcf0169beb111614bbca5c1bac77c0a2a95ae74edc14c06d Feb 20 16:34:06 crc kubenswrapper[4697]: E0220 16:34:06.228376 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:06.728362304 +0000 UTC m=+154.508407712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.235696 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5dstl" podStartSLOduration=130.235673058 podStartE2EDuration="2m10.235673058s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.211235327 +0000 UTC m=+153.991280735" watchObservedRunningTime="2026-02-20 16:34:06.235673058 +0000 UTC m=+154.015718466" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.237714 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" event={"ID":"d919170d-e25f-4a96-9503-edaf4c0c3c51","Type":"ContainerStarted","Data":"05fa5c0c04112d2b9169a578d002eecc1e60c8b81911fb2c46587ad365781dda"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.243375 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rxpbh" event={"ID":"5bd2236d-2830-47cd-9eb8-9d9f07c821b6","Type":"ContainerStarted","Data":"737dd8abb96a81ee18954c4d7ce173d069c903d35266d9dacc8bb964b1e0c558"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.244512 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zp2wm" podStartSLOduration=131.244494692 podStartE2EDuration="2m11.244494692s" 
podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.239971246 +0000 UTC m=+154.020016654" watchObservedRunningTime="2026-02-20 16:34:06.244494692 +0000 UTC m=+154.024540100" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.304937 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" event={"ID":"4795bb15-2193-45c4-80fe-fcb0e99580ca","Type":"ContainerStarted","Data":"1f255e035cfc2079ed273503a4101eeebc8b3765c878a999dd5dcb25c160c45a"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.310552 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txsqk" event={"ID":"70f9d3b5-82e4-47b2-ba65-88980dc9b401","Type":"ContainerStarted","Data":"56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.374652 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wljw5" event={"ID":"0ae20fce-02da-42af-92de-7f3d280732df","Type":"ContainerStarted","Data":"94701de81b3e818b2a3fd6da332df7fd2d495e7671018d4856cb256076a4e9c5"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.374973 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wljw5" event={"ID":"0ae20fce-02da-42af-92de-7f3d280732df","Type":"ContainerStarted","Data":"0b4da6276e0aabc1aa1bf869907c27180d6431ef0fc27dffe604834e6a922972"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.378969 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qm6jb" podStartSLOduration=130.378915454 podStartE2EDuration="2m10.378915454s" podCreationTimestamp="2026-02-20 16:31:56 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.366619426 +0000 UTC m=+154.146664834" watchObservedRunningTime="2026-02-20 16:34:06.378915454 +0000 UTC m=+154.158960862" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.380017 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5ctcg" podStartSLOduration=131.380008117 podStartE2EDuration="2m11.380008117s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.305350141 +0000 UTC m=+154.085395559" watchObservedRunningTime="2026-02-20 16:34:06.380008117 +0000 UTC m=+154.160053525" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.384131 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:06 crc kubenswrapper[4697]: E0220 16:34:06.385913 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:06.885899706 +0000 UTC m=+154.665945114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.390994 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.416768 4697 patch_prober.go:28] interesting pod/router-default-5444994796-rxpbh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 16:34:06 crc kubenswrapper[4697]: [-]has-synced failed: reason withheld Feb 20 16:34:06 crc kubenswrapper[4697]: [+]process-running ok Feb 20 16:34:06 crc kubenswrapper[4697]: healthz check failed Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.416898 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rxpbh" podUID="5bd2236d-2830-47cd-9eb8-9d9f07c821b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.424197 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.431588 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" 
event={"ID":"04900cf7-d92a-4918-92ee-827f0a68f48c","Type":"ContainerStarted","Data":"00152c85e8972ddbcccde4c082ade3079d371dc4e717145e7c6840e47290d8c6"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.431643 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" event={"ID":"04900cf7-d92a-4918-92ee-827f0a68f48c","Type":"ContainerStarted","Data":"d8f332d395e412030cfaa2899ddf6896504aff6e27cbdd22c4d4e44f34cc8671"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.431656 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" event={"ID":"04900cf7-d92a-4918-92ee-827f0a68f48c","Type":"ContainerStarted","Data":"d440ba902ab2adb0beeb03f8bb50bd7bf56e9e87351d9d79b303b0750de35eba"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.435959 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g6ltg"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.436125 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.511052 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" event={"ID":"ba0f29fa-2dd7-4725-8455-ff12c8c4f121","Type":"ContainerStarted","Data":"407687cf0e15b20953197bb49ff7ebf82184ebeca8ecb64f8a2cec2dfc789ba6"} Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.518012 4697 patch_prober.go:28] interesting pod/downloads-7954f5f757-t7pft container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.518082 4697 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t7pft" podUID="314a7b24-36b9-41de-9ec3-3c229fc43b3d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.519166 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lztm2"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.519194 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.522059 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:06 crc kubenswrapper[4697]: E0220 16:34:06.522210 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:07.022196211 +0000 UTC m=+154.802241609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.525030 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:06 crc kubenswrapper[4697]: E0220 16:34:06.526951 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:07.026934156 +0000 UTC m=+154.806979564 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.528824 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.530947 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.550056 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.561210 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-kwgmg" podStartSLOduration=130.561193129 podStartE2EDuration="2m10.561193129s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.509492327 +0000 UTC m=+154.289537735" watchObservedRunningTime="2026-02-20 16:34:06.561193129 +0000 UTC m=+154.341238537" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.567502 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m6zk6"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.567554 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4"] Feb 20 16:34:06 crc kubenswrapper[4697]: W0220 16:34:06.624402 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4f55cb5_9ae7_40fe_8bc9_8278f67bd4bf.slice/crio-bdddced9a940a69acc45ae62ad42a5d8443fe9ea55848b40e708b9f1a818e9dc WatchSource:0}: Error finding container bdddced9a940a69acc45ae62ad42a5d8443fe9ea55848b40e708b9f1a818e9dc: Status 404 returned error can't find the container with id bdddced9a940a69acc45ae62ad42a5d8443fe9ea55848b40e708b9f1a818e9dc Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.629354 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:06 crc kubenswrapper[4697]: E0220 16:34:06.631053 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:07.131033888 +0000 UTC m=+154.911079306 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.638496 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g58sg"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.663911 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk"] Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.687783 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" podStartSLOduration=131.687759586 podStartE2EDuration="2m11.687759586s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.610851772 +0000 UTC m=+154.390897200" watchObservedRunningTime="2026-02-20 16:34:06.687759586 +0000 UTC m=+154.467804994" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.688928 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-txsqk" podStartSLOduration=131.688921751 podStartE2EDuration="2m11.688921751s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.654417228 +0000 UTC m=+154.434462636" watchObservedRunningTime="2026-02-20 16:34:06.688921751 +0000 UTC m=+154.468967159" Feb 20 16:34:06 crc 
kubenswrapper[4697]: I0220 16:34:06.703105 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-wljw5" podStartSLOduration=5.703091423 podStartE2EDuration="5.703091423s" podCreationTimestamp="2026-02-20 16:34:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.681905008 +0000 UTC m=+154.461950416" watchObservedRunningTime="2026-02-20 16:34:06.703091423 +0000 UTC m=+154.483136831" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.741125 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:06 crc kubenswrapper[4697]: E0220 16:34:06.741486 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:07.241473677 +0000 UTC m=+155.021519075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.843075 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:06 crc kubenswrapper[4697]: E0220 16:34:06.843682 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:07.343666685 +0000 UTC m=+155.123712093 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.874727 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-rxpbh" podStartSLOduration=130.874714293 podStartE2EDuration="2m10.874714293s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.873806888 +0000 UTC m=+154.653852296" watchObservedRunningTime="2026-02-20 16:34:06.874714293 +0000 UTC m=+154.654759701" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.928951 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv" podStartSLOduration=130.928929084 podStartE2EDuration="2m10.928929084s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:06.91803148 +0000 UTC m=+154.698076878" watchObservedRunningTime="2026-02-20 16:34:06.928929084 +0000 UTC m=+154.708974492" Feb 20 16:34:06 crc kubenswrapper[4697]: I0220 16:34:06.975624 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: 
\"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:06 crc kubenswrapper[4697]: E0220 16:34:06.976034 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:07.476021057 +0000 UTC m=+155.256066465 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.078044 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:07 crc kubenswrapper[4697]: E0220 16:34:07.078534 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:07.578514507 +0000 UTC m=+155.358559925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.180280 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:07 crc kubenswrapper[4697]: E0220 16:34:07.180786 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:07.680773767 +0000 UTC m=+155.460819175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.210874 4697 csr.go:261] certificate signing request csr-rmk4g is approved, waiting to be issued Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.210911 4697 csr.go:257] certificate signing request csr-rmk4g is issued Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.281362 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:07 crc kubenswrapper[4697]: E0220 16:34:07.281670 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:07.781628783 +0000 UTC m=+155.561674201 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.282150 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:07 crc kubenswrapper[4697]: E0220 16:34:07.282569 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:07.782554719 +0000 UTC m=+155.562600127 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.383188 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:07 crc kubenswrapper[4697]: E0220 16:34:07.383615 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:07.883597992 +0000 UTC m=+155.663643400 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.401839 4697 patch_prober.go:28] interesting pod/router-default-5444994796-rxpbh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 16:34:07 crc kubenswrapper[4697]: [-]has-synced failed: reason withheld Feb 20 16:34:07 crc kubenswrapper[4697]: [+]process-running ok Feb 20 16:34:07 crc kubenswrapper[4697]: healthz check failed Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.401906 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rxpbh" podUID="5bd2236d-2830-47cd-9eb8-9d9f07c821b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.485087 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:07 crc kubenswrapper[4697]: E0220 16:34:07.485982 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-20 16:34:07.985965867 +0000 UTC m=+155.766011275 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.587703 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:07 crc kubenswrapper[4697]: E0220 16:34:07.588030 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:08.088015159 +0000 UTC m=+155.868060567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.588659 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g58sg" event={"ID":"c0d62d0c-e4d0-4316-ae51-929ffb7295ac","Type":"ContainerStarted","Data":"0efe4dcba2ad1ac29f3e035f323d35d0b6c490f18a1670ccc0a9a072f41a4a1a"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.603627 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m" event={"ID":"47c55b18-d2e2-4b8a-a976-6955bc13425b","Type":"ContainerStarted","Data":"62697e20430083b284859fed04634fe3ec38296eae0164b9e8dbf366a19a24f0"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.603675 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m" event={"ID":"47c55b18-d2e2-4b8a-a976-6955bc13425b","Type":"ContainerStarted","Data":"990704a3ea292e5cd5f9f1dcf8df5d5181150d11c83e083e7ab4e723432d7094"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.625594 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" event={"ID":"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc","Type":"ContainerStarted","Data":"9b37fded6a4063a2e97d6f1776f6235d91be553f3170bb77fd0d36b04b208de0"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.625754 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" 
event={"ID":"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc","Type":"ContainerStarted","Data":"0e5bab8c76bdd9af81557c9f97cbc76fe7973ea15cfc7644cb69846b6788a8b4"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.633714 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" event={"ID":"d919170d-e25f-4a96-9503-edaf4c0c3c51","Type":"ContainerStarted","Data":"028a8be4b22ef5e98577ff42d1b22e53ccbd46a0fdf70a507102159c01a427ec"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.634069 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.638842 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g6ltg" event={"ID":"382a81b1-8095-480d-9ead-e697f92dd700","Type":"ContainerStarted","Data":"d71bc1cefa20565a5ab1fb875dd9e6b1f0bb48ccd94fce2b52ba099b10917821"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.679898 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" event={"ID":"eda7a295-93d7-4a47-915b-ba8b58bfb325","Type":"ContainerStarted","Data":"1a305d28bd41252dc219d72a16b2ea0697a26abc724e31e252a3294dc6f2a07f"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.689497 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:07 crc kubenswrapper[4697]: E0220 16:34:07.691566 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:08.191555009 +0000 UTC m=+155.971600417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.705802 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" event={"ID":"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be","Type":"ContainerStarted","Data":"6e96c77402323ee4dcf0169beb111614bbca5c1bac77c0a2a95ae74edc14c06d"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.740467 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" event={"ID":"1dbb357b-a417-4661-b97c-272f1c7a5e0e","Type":"ContainerStarted","Data":"5330a8c3f9b2acd34c943ce7e8866679225ee233bebba1ab47ebc93cb251b0e7"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.776083 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" event={"ID":"ba0f29fa-2dd7-4725-8455-ff12c8c4f121","Type":"ContainerStarted","Data":"7e49a29cd577b26fa98f7ceabc86e198583d1904d5e522bfbcc8aadfe51af934"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.790517 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:07 crc kubenswrapper[4697]: E0220 16:34:07.791326 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:08.291313283 +0000 UTC m=+156.071358691 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.798481 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" event={"ID":"234add05-4e79-4fd0-8412-1101f36476d0","Type":"ContainerStarted","Data":"602f37f5a19725ee57c9154ba41cbcbea05cae0961634acc743a28a95dc2be2e"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.798525 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" event={"ID":"234add05-4e79-4fd0-8412-1101f36476d0","Type":"ContainerStarted","Data":"a0bd0fe4e9936d19ee1bdb73c3a08f4dc114f6a6530967bd205e180420f23191"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.799007 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.800084 4697 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7k8rk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness 
probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.800121 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" podUID="234add05-4e79-4fd0-8412-1101f36476d0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.800660 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lztm2" event={"ID":"b0966f26-8761-43ef-9d07-e6f0080885ec","Type":"ContainerStarted","Data":"64e0413e768bf3845c9e6126e082dcb71f0360713b2b7bb70e44dceb20705bb5"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.803329 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" event={"ID":"4795bb15-2193-45c4-80fe-fcb0e99580ca","Type":"ContainerStarted","Data":"23275c14245160d482defc507c46242e5c7f4a0d119fbfc3f4cdbed3124f85bf"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.812828 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" podStartSLOduration=132.812811779 podStartE2EDuration="2m12.812811779s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:07.811868483 +0000 UTC m=+155.591913891" watchObservedRunningTime="2026-02-20 16:34:07.812811779 +0000 UTC m=+155.592857187" Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.848163 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc" event={"ID":"b6b241e1-8a1e-4e61-b5ee-7b548b6301d9","Type":"ContainerStarted","Data":"4964671034aba6add7e3577be22e957daf0022dd06c4b2a1286e6c42b4f0c3f1"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.848217 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc" event={"ID":"b6b241e1-8a1e-4e61-b5ee-7b548b6301d9","Type":"ContainerStarted","Data":"84225be6f9c6de866bb020d989455c21885c7dfaf5058fae3400f3685666b245"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.870822 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" event={"ID":"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1","Type":"ContainerStarted","Data":"d2617a3ab2ce3b55ea9448faeb6cff542ef806d7b37c7ecc3cdafdcbb1fdfcc7"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.892264 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:07 crc kubenswrapper[4697]: E0220 16:34:07.892576 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:08.392564934 +0000 UTC m=+156.172610342 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.905390 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4" event={"ID":"c4f55cb5-9ae7-40fe-8bc9-8278f67bd4bf","Type":"ContainerStarted","Data":"bdddced9a940a69acc45ae62ad42a5d8443fe9ea55848b40e708b9f1a818e9dc"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.937863 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f5mvc" podStartSLOduration=131.937848417 podStartE2EDuration="2m11.937848417s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:07.935962153 +0000 UTC m=+155.716007561" watchObservedRunningTime="2026-02-20 16:34:07.937848417 +0000 UTC m=+155.717893815" Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.952803 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wsc22" event={"ID":"0b089c24-376b-4007-a8f0-7ead066569db","Type":"ContainerStarted","Data":"eb45f67e0237b5fe1933710f3026f4587311cae04f4d5eb774d880a9b835dfd0"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.968677 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" 
event={"ID":"adf825e0-c430-439f-9d7c-55b7582f1b54","Type":"ContainerStarted","Data":"07441c3fbdabb91968fc2535f0a204ecf1c420b6c747e2e87ae34f0f2a0bcdcc"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.968726 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" event={"ID":"adf825e0-c430-439f-9d7c-55b7582f1b54","Type":"ContainerStarted","Data":"7cb57bc510ddd5d77bf06f19139ca8d9fd707ad6b0dbb96dfe7d73814ff63069"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.969568 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.985513 4697 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m6zk6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.985583 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" podUID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.993145 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:07 crc kubenswrapper[4697]: E0220 16:34:07.994176 4697 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:08.494157329 +0000 UTC m=+156.274202737 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.996000 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt" event={"ID":"8912bf5c-045e-4c86-9a09-41c4cab10139","Type":"ContainerStarted","Data":"fd94cdad1058814fbf416e3a1ff78cde509701bb179c6af1cb7a9f005687b11d"} Feb 20 16:34:07 crc kubenswrapper[4697]: I0220 16:34:07.996041 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt" event={"ID":"8912bf5c-045e-4c86-9a09-41c4cab10139","Type":"ContainerStarted","Data":"1d3ced031126aa6c725ed404de678f673cd020cdda17542925eef9fc4fdc690b"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.022134 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" event={"ID":"5a156d77-af5f-4ff3-be63-279a07ea90f9","Type":"ContainerStarted","Data":"8aba7f3b39b231c3e003b6cdc7932f4a2f78936b6e3a938ba817f56b3e2426f5"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.040160 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" 
event={"ID":"734eb893-5f04-4ae9-b45a-cd1ff030a1e8","Type":"ContainerStarted","Data":"0a961d52e2d7bdc296529d559a4871a3bf10938fc0d5674d706d5c00b8d3c40b"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.040533 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" event={"ID":"734eb893-5f04-4ae9-b45a-cd1ff030a1e8","Type":"ContainerStarted","Data":"0a0e563debf1177acac44827c0ffead70377bb6dce6b5eb1cea0a92dfb237d76"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.053605 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x" event={"ID":"2bcd3306-4d3d-484a-8aa6-cb90424cf206","Type":"ContainerStarted","Data":"069acc12ca608881c7466d1baa7bdf95d25ffccb01408eb3397e02b4212b3583"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.072121 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km" event={"ID":"9efcd507-8765-4845-8a65-3df282d78a69","Type":"ContainerStarted","Data":"efe2adfc78cc332b4e3407f066b819a7a8978fe8efa04f0adb5243ac2f9d4af9"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.089222 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" event={"ID":"5f5d4dca-8439-49b5-8996-ff1587bc2230","Type":"ContainerStarted","Data":"bc7c22054f4200e7e7e7eda5cf428ff3c4c54bdc6c7c9220eea8eb1590e05cfc"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.089278 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" event={"ID":"5f5d4dca-8439-49b5-8996-ff1587bc2230","Type":"ContainerStarted","Data":"2e9013b07a94cf6df5713a8c7713ac48d556a0a0afae4e218a679268fedc5054"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.091591 4697 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.094237 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" podStartSLOduration=132.094221113 podStartE2EDuration="2m12.094221113s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:08.0299127 +0000 UTC m=+155.809958108" watchObservedRunningTime="2026-02-20 16:34:08.094221113 +0000 UTC m=+155.874266521" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.094609 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-cm59t" podStartSLOduration=133.094602768 podStartE2EDuration="2m13.094602768s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:08.091465006 +0000 UTC m=+155.871510414" watchObservedRunningTime="2026-02-20 16:34:08.094602768 +0000 UTC m=+155.874648186" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.096271 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.097922 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:08.597908187 +0000 UTC m=+156.377953595 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.103960 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.113477 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" event={"ID":"95906860-29b3-49a2-b903-37b9a5a808a5","Type":"ContainerStarted","Data":"4464255c5a1c57a8e2a67aaf79c6b738fc5cacdb33c0ae9c5950c814b4acf82a"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.113518 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" event={"ID":"95906860-29b3-49a2-b903-37b9a5a808a5","Type":"ContainerStarted","Data":"7a53567664aca3d284841e81dbcc2728c226ac72c140554e4350c42b226656b5"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.118180 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6lsrp" podStartSLOduration=132.118151925 podStartE2EDuration="2m12.118151925s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:08.115770212 
+0000 UTC m=+155.895815630" watchObservedRunningTime="2026-02-20 16:34:08.118151925 +0000 UTC m=+155.898197333" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.125396 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" event={"ID":"fdb22db1-79e1-4c5f-8853-ac893e454485","Type":"ContainerStarted","Data":"3b5801b5229f956713c5cbfcf1167f4b4d4bc9aa0a59c245904f27cf53707a22"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.129351 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5" event={"ID":"7dedc6ab-dd63-42b7-a4d8-35f9d867a546","Type":"ContainerStarted","Data":"bd87ee90931d0a19ebe4bba0bb8ad3f0b7e10dbca37983b5498a5f29ba0820b0"} Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.167113 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hxqpv" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.167626 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" podStartSLOduration=132.16761398 podStartE2EDuration="2m12.16761398s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:08.166688434 +0000 UTC m=+155.946733842" watchObservedRunningTime="2026-02-20 16:34:08.16761398 +0000 UTC m=+155.947659388" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.186303 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dkznt" podStartSLOduration=132.186289377 podStartE2EDuration="2m12.186289377s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:08.185527158 +0000 UTC m=+155.965572566" watchObservedRunningTime="2026-02-20 16:34:08.186289377 +0000 UTC m=+155.966334785" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.196768 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.198402 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:08.698387598 +0000 UTC m=+156.478433006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.212626 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-20 16:29:07 +0000 UTC, rotation deadline is 2027-01-08 07:25:33.007397465 +0000 UTC Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.212654 4697 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7718h51m24.794745551s for next certificate rotation Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.233259 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" podStartSLOduration=132.233221414 podStartE2EDuration="2m12.233221414s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:08.209541162 +0000 UTC m=+155.989586570" watchObservedRunningTime="2026-02-20 16:34:08.233221414 +0000 UTC m=+156.013266822" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.233637 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" podStartSLOduration=133.23363259 podStartE2EDuration="2m13.23363259s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:08.229309312 +0000 UTC m=+156.009354740" 
watchObservedRunningTime="2026-02-20 16:34:08.23363259 +0000 UTC m=+156.013677998" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.293238 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qk7vb" podStartSLOduration=133.29321776 podStartE2EDuration="2m13.29321776s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:08.263860747 +0000 UTC m=+156.043906155" watchObservedRunningTime="2026-02-20 16:34:08.29321776 +0000 UTC m=+156.073263168" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.312227 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.312618 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:08.812605824 +0000 UTC m=+156.592651232 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.399421 4697 patch_prober.go:28] interesting pod/router-default-5444994796-rxpbh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 16:34:08 crc kubenswrapper[4697]: [-]has-synced failed: reason withheld Feb 20 16:34:08 crc kubenswrapper[4697]: [+]process-running ok Feb 20 16:34:08 crc kubenswrapper[4697]: healthz check failed Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.400092 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rxpbh" podUID="5bd2236d-2830-47cd-9eb8-9d9f07c821b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.420778 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.421150 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 16:34:08.921127719 +0000 UTC m=+156.701173127 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.485234 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.485584 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.496992 4697 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-nct46 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.497047 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" podUID="5a156d77-af5f-4ff3-be63-279a07ea90f9" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.522273 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.522634 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.022622339 +0000 UTC m=+156.802667747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.623181 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.623498 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.123461135 +0000 UTC m=+156.903506543 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.623873 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.624265 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.124250145 +0000 UTC m=+156.904295553 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.634326 4697 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-m8rl6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.634375 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.725586 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.725751 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 16:34:09.225726345 +0000 UTC m=+157.005771753 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.726073 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.726463 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.226451854 +0000 UTC m=+157.006497262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.826998 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.827227 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.327197215 +0000 UTC m=+157.107242623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.827605 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.827878 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.327870431 +0000 UTC m=+157.107915839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.929141 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.929342 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.42930808 +0000 UTC m=+157.209353488 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:08 crc kubenswrapper[4697]: I0220 16:34:08.929592 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:08 crc kubenswrapper[4697]: E0220 16:34:08.929968 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.429950895 +0000 UTC m=+157.209996303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.030313 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.030454 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.530418366 +0000 UTC m=+157.310463764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.030573 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.030915 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.530903175 +0000 UTC m=+157.310948573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.131257 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.131489 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.631461079 +0000 UTC m=+157.411506487 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.135071 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wsc22" event={"ID":"0b089c24-376b-4007-a8f0-7ead066569db","Type":"ContainerStarted","Data":"4fe051f48bfae24ce3a17a50e70ec08f339ba2897371147d153dc6ec3a6065aa"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.136300 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" event={"ID":"1dbb357b-a417-4661-b97c-272f1c7a5e0e","Type":"ContainerStarted","Data":"bb1e6a57a6929bc2df81fd42b3a71d124065c6cd7c7f61c3b5506be204d307e5"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.136550 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.137531 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km" event={"ID":"9efcd507-8765-4845-8a65-3df282d78a69","Type":"ContainerStarted","Data":"e336f1c843468ddd47fb7bf24436f6d3adccb8137f335df23e18b3227d260c8d"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.138257 4697 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xmjwk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 
10.217.0.38:5443: connect: connection refused" start-of-body= Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.138312 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" podUID="1dbb357b-a417-4661-b97c-272f1c7a5e0e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.140140 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" event={"ID":"ca43f4e8-ab22-4573-b5cb-5d58dbf788f1","Type":"ContainerStarted","Data":"439e4e73cd1dd028d13116bc9aa81c2117990955b1152b6cad10ca202dc92474"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.141732 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4" event={"ID":"c4f55cb5-9ae7-40fe-8bc9-8278f67bd4bf","Type":"ContainerStarted","Data":"cbbb9509e6f79380158d354fe47a579a12c215fcbd5828eed0cd5772e46ee3ee"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.141798 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4" event={"ID":"c4f55cb5-9ae7-40fe-8bc9-8278f67bd4bf","Type":"ContainerStarted","Data":"5485971c5ade5eb97dc05d1e7508853698d9492235b35d30aad7cfaca01fbc7f"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.142745 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g6ltg" event={"ID":"382a81b1-8095-480d-9ead-e697f92dd700","Type":"ContainerStarted","Data":"6daf2316733c846dd6869ce10ad511c5b4e033394b45971b7c222e9203eb9522"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.144658 4697 generic.go:334] "Generic (PLEG): container finished" podID="535cb2ff-6eeb-467d-b67b-ed6c1f3a51be" 
containerID="fa40246bfdf1ec6f2e04bd7ab4715df3f8922483b76434c23a9da442d2741c76" exitCode=0 Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.144720 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" event={"ID":"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be","Type":"ContainerStarted","Data":"f1f834125270f05410b5c592c3e588eef0043750a71a5f526ecec3d0a3ada60a"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.144737 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" event={"ID":"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be","Type":"ContainerStarted","Data":"669c93f96db2772216c9eb94ffc17d52f9821195fdd3a1412fd844904e67d551"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.144747 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" event={"ID":"535cb2ff-6eeb-467d-b67b-ed6c1f3a51be","Type":"ContainerDied","Data":"fa40246bfdf1ec6f2e04bd7ab4715df3f8922483b76434c23a9da442d2741c76"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.146264 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" event={"ID":"ba0f29fa-2dd7-4725-8455-ff12c8c4f121","Type":"ContainerStarted","Data":"75e34a828f7bd9fb7b128218ef9f82c62b280f4e4be22d4c066cfff4bae5d12e"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.147562 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m" event={"ID":"47c55b18-d2e2-4b8a-a976-6955bc13425b","Type":"ContainerStarted","Data":"df6ee251b92e170dd41346782fbbcd190d87a7c5da50a39b2c6a5187128e76e8"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.149171 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lztm2" 
event={"ID":"b0966f26-8761-43ef-9d07-e6f0080885ec","Type":"ContainerStarted","Data":"3facb73e68ba2151522e096eaaedc2d5226e1cea8ab4941c3134de052c967887"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.150507 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" event={"ID":"1e27abc7-011f-4e2f-bd37-98fc4ebc15dc","Type":"ContainerStarted","Data":"7b0610bb60c4890d02dc76071514233928b3082ad975c9456049838e7668e255"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.151589 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5" event={"ID":"7dedc6ab-dd63-42b7-a4d8-35f9d867a546","Type":"ContainerStarted","Data":"b545cc40c489840433bb83453bca3407618c1bf1b2960d9dc98d875f8b2f1b63"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.153026 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" event={"ID":"eda7a295-93d7-4a47-915b-ba8b58bfb325","Type":"ContainerStarted","Data":"154d2ad5e1c4fce071d8acb20e2cf88eaf9cc791fd50b8329ca277e42408eb90"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.154043 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x" event={"ID":"2bcd3306-4d3d-484a-8aa6-cb90424cf206","Type":"ContainerStarted","Data":"90bcc3900e6e9a6761ddcff4b3c840fa5d5f6157ddb8e95aae30d7a81e555f10"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.155333 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" event={"ID":"734eb893-5f04-4ae9-b45a-cd1ff030a1e8","Type":"ContainerStarted","Data":"c7883b6a64bae709e4aba0da9c0ad6bea9656a989a55e170c6dc9f9487fab3b9"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.156837 4697 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/dns-default-g58sg" event={"ID":"c0d62d0c-e4d0-4316-ae51-929ffb7295ac","Type":"ContainerStarted","Data":"5471467105aee3ccd3b85e0e613e03dd372cbe2f140febf4b4ca7c53a5e1154c"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.156878 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g58sg" event={"ID":"c0d62d0c-e4d0-4316-ae51-929ffb7295ac","Type":"ContainerStarted","Data":"2f70c53b38574c20894fabf8d9bdf4f883f14467762b9fb5bf4e61f4fe67df9a"} Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.157844 4697 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m6zk6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.157885 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" podUID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.163667 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.167547 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7k8rk" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.213809 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-f6d4m" podStartSLOduration=133.213792584 podStartE2EDuration="2m13.213792584s" podCreationTimestamp="2026-02-20 16:31:56 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.211760195 +0000 UTC m=+156.991805603" watchObservedRunningTime="2026-02-20 16:34:09.213792584 +0000 UTC m=+156.993837992" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.214301 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-wsc22" podStartSLOduration=134.214295343 podStartE2EDuration="2m14.214295343s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.161060771 +0000 UTC m=+156.941106169" watchObservedRunningTime="2026-02-20 16:34:09.214295343 +0000 UTC m=+156.994340751" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.239089 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.242557 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.742539363 +0000 UTC m=+157.522584771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.268097 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.268463 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8r2q4" podStartSLOduration=133.26842699 podStartE2EDuration="2m13.26842699s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.238588879 +0000 UTC m=+157.018634287" watchObservedRunningTime="2026-02-20 16:34:09.26842699 +0000 UTC m=+157.048472398" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.268558 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.270718 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-g6ltg" podStartSLOduration=8.270710109 podStartE2EDuration="8.270710109s" podCreationTimestamp="2026-02-20 16:34:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.268332397 +0000 UTC m=+157.048377805" watchObservedRunningTime="2026-02-20 16:34:09.270710109 +0000 UTC 
m=+157.050755517" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.272256 4697 patch_prober.go:28] interesting pod/apiserver-76f77b778f-mqvww container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.20:8443/livez\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.272385 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" podUID="535cb2ff-6eeb-467d-b67b-ed6c1f3a51be" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.20:8443/livez\": dial tcp 10.217.0.20:8443: connect: connection refused" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.307178 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g58sg" podStartSLOduration=8.307155118 podStartE2EDuration="8.307155118s" podCreationTimestamp="2026-02-20 16:34:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.306028464 +0000 UTC m=+157.086073872" watchObservedRunningTime="2026-02-20 16:34:09.307155118 +0000 UTC m=+157.087200526" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.327142 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r4lhc" podStartSLOduration=133.327123845 podStartE2EDuration="2m13.327123845s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.32621862 +0000 UTC m=+157.106264028" watchObservedRunningTime="2026-02-20 16:34:09.327123845 +0000 UTC m=+157.107169253" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.340681 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.340883 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.84085142 +0000 UTC m=+157.620896818 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.341323 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.341702 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.841692722 +0000 UTC m=+157.621738130 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.398422 4697 patch_prober.go:28] interesting pod/router-default-5444994796-rxpbh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 16:34:09 crc kubenswrapper[4697]: [-]has-synced failed: reason withheld Feb 20 16:34:09 crc kubenswrapper[4697]: [+]process-running ok Feb 20 16:34:09 crc kubenswrapper[4697]: healthz check failed Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.398754 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rxpbh" podUID="5bd2236d-2830-47cd-9eb8-9d9f07c821b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.399363 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-d5xj5" podStartSLOduration=133.399343765 podStartE2EDuration="2m13.399343765s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.368763316 +0000 UTC m=+157.148808724" watchObservedRunningTime="2026-02-20 16:34:09.399343765 +0000 UTC m=+157.179389173" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.401697 4697 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q4k9q" podStartSLOduration=134.401685887 podStartE2EDuration="2m14.401685887s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.399275653 +0000 UTC m=+157.179321061" watchObservedRunningTime="2026-02-20 16:34:09.401685887 +0000 UTC m=+157.181731295" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.442550 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.443302 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:09.943275576 +0000 UTC m=+157.723320994 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.504609 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" podStartSLOduration=134.504589672 podStartE2EDuration="2m14.504589672s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.475693497 +0000 UTC m=+157.255738915" watchObservedRunningTime="2026-02-20 16:34:09.504589672 +0000 UTC m=+157.284635080" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.505192 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6ct9m" podStartSLOduration=133.505186785 podStartE2EDuration="2m13.505186785s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.504376834 +0000 UTC m=+157.284422252" watchObservedRunningTime="2026-02-20 16:34:09.505186785 +0000 UTC m=+157.285232193" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.525513 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-lztm2" podStartSLOduration=133.525492776 podStartE2EDuration="2m13.525492776s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.524939804 +0000 UTC m=+157.304985212" watchObservedRunningTime="2026-02-20 16:34:09.525492776 +0000 UTC m=+157.305538184" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.553051 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.553476 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.053459955 +0000 UTC m=+157.833505363 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.563632 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kz5km" podStartSLOduration=133.56361435 podStartE2EDuration="2m13.56361435s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.555203852 +0000 UTC m=+157.335249270" watchObservedRunningTime="2026-02-20 16:34:09.56361435 +0000 UTC m=+157.343659748" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.574918 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" podStartSLOduration=133.574903149 podStartE2EDuration="2m13.574903149s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.574561586 +0000 UTC m=+157.354606994" watchObservedRunningTime="2026-02-20 16:34:09.574903149 +0000 UTC m=+157.354948557" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.600736 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gr64d" podStartSLOduration=133.600719604 podStartE2EDuration="2m13.600719604s" podCreationTimestamp="2026-02-20 
16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.597832542 +0000 UTC m=+157.377877950" watchObservedRunningTime="2026-02-20 16:34:09.600719604 +0000 UTC m=+157.380765012" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.624298 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g9d6x" podStartSLOduration=133.624282401 podStartE2EDuration="2m13.624282401s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:09.621094907 +0000 UTC m=+157.401140305" watchObservedRunningTime="2026-02-20 16:34:09.624282401 +0000 UTC m=+157.404327809" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.653901 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.654243 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.154229717 +0000 UTC m=+157.934275125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.754766 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.755086 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.255075623 +0000 UTC m=+158.035121031 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.807596 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7b742"] Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.808590 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7b742" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.810172 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.833505 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7b742"] Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.855814 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.856038 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxx9k\" (UniqueName: \"kubernetes.io/projected/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-kube-api-access-mxx9k\") pod \"community-operators-7b742\" (UID: \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " 
pod="openshift-marketplace/community-operators-7b742" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.856112 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-utilities\") pod \"community-operators-7b742\" (UID: \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " pod="openshift-marketplace/community-operators-7b742" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.856136 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-catalog-content\") pod \"community-operators-7b742\" (UID: \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " pod="openshift-marketplace/community-operators-7b742" Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.856267 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.356252181 +0000 UTC m=+158.136297589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.957519 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.957564 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxx9k\" (UniqueName: \"kubernetes.io/projected/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-kube-api-access-mxx9k\") pod \"community-operators-7b742\" (UID: \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " pod="openshift-marketplace/community-operators-7b742" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.957613 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-utilities\") pod \"community-operators-7b742\" (UID: \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " pod="openshift-marketplace/community-operators-7b742" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.957640 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-catalog-content\") pod \"community-operators-7b742\" (UID: 
\"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " pod="openshift-marketplace/community-operators-7b742" Feb 20 16:34:09 crc kubenswrapper[4697]: E0220 16:34:09.957895 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.457879887 +0000 UTC m=+158.237925285 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.958034 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-catalog-content\") pod \"community-operators-7b742\" (UID: \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " pod="openshift-marketplace/community-operators-7b742" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.958290 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-utilities\") pod \"community-operators-7b742\" (UID: \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " pod="openshift-marketplace/community-operators-7b742" Feb 20 16:34:09 crc kubenswrapper[4697]: I0220 16:34:09.985135 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxx9k\" (UniqueName: \"kubernetes.io/projected/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-kube-api-access-mxx9k\") pod \"community-operators-7b742\" (UID: 
\"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " pod="openshift-marketplace/community-operators-7b742" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.012198 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qjcxf"] Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.013496 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.023049 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.039675 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjcxf"] Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.058742 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.058899 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.558874278 +0000 UTC m=+158.338919686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.058962 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pfjk\" (UniqueName: \"kubernetes.io/projected/7e335707-03f3-4791-85c7-95134150dc71-kube-api-access-9pfjk\") pod \"certified-operators-qjcxf\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.059188 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-catalog-content\") pod \"certified-operators-qjcxf\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.059229 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.059320 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-utilities\") pod \"certified-operators-qjcxf\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.059532 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.559519023 +0000 UTC m=+158.339564431 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.130447 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7b742" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.162879 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.163063 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 16:34:10.663036203 +0000 UTC m=+158.443081611 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.163104 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-utilities\") pod \"certified-operators-qjcxf\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.163148 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pfjk\" (UniqueName: \"kubernetes.io/projected/7e335707-03f3-4791-85c7-95134150dc71-kube-api-access-9pfjk\") pod \"certified-operators-qjcxf\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.163213 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-catalog-content\") pod \"certified-operators-qjcxf\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.163233 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.163635 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-utilities\") pod \"certified-operators-qjcxf\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.164187 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-catalog-content\") pod \"certified-operators-qjcxf\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.164500 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.664487449 +0000 UTC m=+158.444532857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.196446 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kb7jz"] Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.197416 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.206479 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" event={"ID":"eda7a295-93d7-4a47-915b-ba8b58bfb325","Type":"ContainerStarted","Data":"bbaff2ed66f4189f5b615a4ed6ed5285f38e2430aee9bc2628a428b97b17a77f"} Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.208359 4697 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m6zk6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.208413 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" podUID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.208663 4697 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9pfjk\" (UniqueName: \"kubernetes.io/projected/7e335707-03f3-4791-85c7-95134150dc71-kube-api-access-9pfjk\") pod \"certified-operators-qjcxf\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.209014 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-g58sg" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.210972 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kb7jz"] Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.265001 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.265145 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.765119967 +0000 UTC m=+158.545165375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.265336 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-utilities\") pod \"community-operators-kb7jz\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") " pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.265365 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2trq4\" (UniqueName: \"kubernetes.io/projected/e9652ea9-a08a-4700-9a77-00b227062e91-kube-api-access-2trq4\") pod \"community-operators-kb7jz\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") " pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.265591 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-catalog-content\") pod \"community-operators-kb7jz\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") " pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.266084 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.281654 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.781622919 +0000 UTC m=+158.561668327 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.329395 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.370883 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.371075 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-utilities\") pod \"community-operators-kb7jz\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") " pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.371103 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2trq4\" (UniqueName: \"kubernetes.io/projected/e9652ea9-a08a-4700-9a77-00b227062e91-kube-api-access-2trq4\") pod \"community-operators-kb7jz\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") " pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.371160 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.871115983 +0000 UTC m=+158.651161391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.371254 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-catalog-content\") pod \"community-operators-kb7jz\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") " pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.371483 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.371804 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.871796869 +0000 UTC m=+158.651842277 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.371904 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-utilities\") pod \"community-operators-kb7jz\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") " pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.376312 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-catalog-content\") pod \"community-operators-kb7jz\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") " pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.393633 4697 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.394076 4697 patch_prober.go:28] interesting pod/router-default-5444994796-rxpbh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 16:34:10 crc kubenswrapper[4697]: [-]has-synced failed: reason withheld Feb 20 16:34:10 crc kubenswrapper[4697]: [+]process-running ok Feb 20 16:34:10 crc kubenswrapper[4697]: healthz check failed Feb 20 16:34:10 crc 
kubenswrapper[4697]: I0220 16:34:10.394120 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rxpbh" podUID="5bd2236d-2830-47cd-9eb8-9d9f07c821b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.401593 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k68tp"] Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.404678 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.407177 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2trq4\" (UniqueName: \"kubernetes.io/projected/e9652ea9-a08a-4700-9a77-00b227062e91-kube-api-access-2trq4\") pod \"community-operators-kb7jz\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") " pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.431418 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k68tp"] Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.478380 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.478624 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 16:34:10.978588776 +0000 UTC m=+158.758634184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.479172 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-utilities\") pod \"certified-operators-k68tp\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") " pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.479200 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hsln\" (UniqueName: \"kubernetes.io/projected/efb428f3-f886-42e8-91cc-fba15966023c-kube-api-access-9hsln\") pod \"certified-operators-k68tp\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") " pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.479265 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-catalog-content\") pod \"certified-operators-k68tp\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") " pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.479313 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.479736 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:10.97971792 +0000 UTC m=+158.759763328 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.529139 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.558626 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xmjwk" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.581146 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.581307 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-catalog-content\") pod \"certified-operators-k68tp\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") " pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.581386 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-utilities\") pod \"certified-operators-k68tp\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") " pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.581404 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hsln\" (UniqueName: \"kubernetes.io/projected/efb428f3-f886-42e8-91cc-fba15966023c-kube-api-access-9hsln\") pod \"certified-operators-k68tp\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") " pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.581723 4697 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:11.08170959 +0000 UTC m=+158.861754998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.582580 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-catalog-content\") pod \"certified-operators-k68tp\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") " pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.582811 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-utilities\") pod \"certified-operators-k68tp\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") " pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.612189 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hsln\" (UniqueName: \"kubernetes.io/projected/efb428f3-f886-42e8-91cc-fba15966023c-kube-api-access-9hsln\") pod \"certified-operators-k68tp\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") " pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.654685 4697 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.684587 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.685599 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:11.185582143 +0000 UTC m=+158.965627551 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.744819 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjcxf"] Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.760248 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.770116 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7b742"] Feb 20 16:34:10 crc kubenswrapper[4697]: W0220 16:34:10.771673 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e335707_03f3_4791_85c7_95134150dc71.slice/crio-871317ba189169fdcb506bab56dc9f3d8ddc06baf081481d55d03553b2c1f592 WatchSource:0}: Error finding container 871317ba189169fdcb506bab56dc9f3d8ddc06baf081481d55d03553b2c1f592: Status 404 returned error can't find the container with id 871317ba189169fdcb506bab56dc9f3d8ddc06baf081481d55d03553b2c1f592 Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.788565 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.788865 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:11.288839753 +0000 UTC m=+159.068885171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.891243 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:10 crc kubenswrapper[4697]: E0220 16:34:10.891597 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:11.391572902 +0000 UTC m=+159.171618310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:10 crc kubenswrapper[4697]: I0220 16:34:10.911669 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kb7jz"] Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.002710 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:11 crc kubenswrapper[4697]: E0220 16:34:11.003225 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 16:34:11.503199747 +0000 UTC m=+159.283245145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.055580 4697 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-20T16:34:10.39365585Z","Handler":null,"Name":""} Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.103769 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:11 crc kubenswrapper[4697]: E0220 16:34:11.104137 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 16:34:11.604122116 +0000 UTC m=+159.384167524 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s4bgz" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.159960 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k68tp"] Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.160160 4697 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.160182 4697 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.204907 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.211784 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.227595 4697 generic.go:334] "Generic (PLEG): container finished" podID="e9652ea9-a08a-4700-9a77-00b227062e91" containerID="298f7d7f37a7452693a57cac7a66e42ee3bebc86d248ac8d173d617950555835" exitCode=0 Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.227655 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kb7jz" event={"ID":"e9652ea9-a08a-4700-9a77-00b227062e91","Type":"ContainerDied","Data":"298f7d7f37a7452693a57cac7a66e42ee3bebc86d248ac8d173d617950555835"} Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.227684 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kb7jz" event={"ID":"e9652ea9-a08a-4700-9a77-00b227062e91","Type":"ContainerStarted","Data":"7ab933464df224c5e64c57860f8b0234ffcbed328803e0b1f4a1f40c3f28eae2"} Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.229936 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.231864 4697 generic.go:334] "Generic (PLEG): container finished" podID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" containerID="16942f6820ce80eb288c38fcdca5fb819586b98462d96099abe6d6ec18b99e6f" exitCode=0 Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.231908 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b742" event={"ID":"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3","Type":"ContainerDied","Data":"16942f6820ce80eb288c38fcdca5fb819586b98462d96099abe6d6ec18b99e6f"} Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.231926 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b742" 
event={"ID":"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3","Type":"ContainerStarted","Data":"7d1965fdec4457cdf648de5e19bba67342e81a407c722965f4e57738aca165d2"} Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.234507 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k68tp" event={"ID":"efb428f3-f886-42e8-91cc-fba15966023c","Type":"ContainerStarted","Data":"969a8728439fc3786328e92ae3a2cad29914f5d00968e004c38de3ddc2f020b9"} Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.237299 4697 generic.go:334] "Generic (PLEG): container finished" podID="7e335707-03f3-4791-85c7-95134150dc71" containerID="71660c9cd6894146428f2b1cc14feb5449a75ea175f32787fed93797acf489a1" exitCode=0 Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.237334 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcxf" event={"ID":"7e335707-03f3-4791-85c7-95134150dc71","Type":"ContainerDied","Data":"71660c9cd6894146428f2b1cc14feb5449a75ea175f32787fed93797acf489a1"} Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.237347 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcxf" event={"ID":"7e335707-03f3-4791-85c7-95134150dc71","Type":"ContainerStarted","Data":"871317ba189169fdcb506bab56dc9f3d8ddc06baf081481d55d03553b2c1f592"} Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.242740 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" event={"ID":"eda7a295-93d7-4a47-915b-ba8b58bfb325","Type":"ContainerStarted","Data":"c88fc2609f45aea98ecb5e564ce914683f2184f1c1c321beab2a9ad626eead18"} Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.242785 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" 
event={"ID":"eda7a295-93d7-4a47-915b-ba8b58bfb325","Type":"ContainerStarted","Data":"9ad2945436ab0f46a33d14d485d0b58312a9685ad240967c51fa4f72f2166c9c"} Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.308071 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.312610 4697 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.312662 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.330636 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5vp6h" podStartSLOduration=10.330618832 podStartE2EDuration="10.330618832s" podCreationTimestamp="2026-02-20 16:34:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:11.329130124 +0000 UTC m=+159.109175542" watchObservedRunningTime="2026-02-20 16:34:11.330618832 +0000 UTC m=+159.110664240" 
Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.339286 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s4bgz\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.401586 4697 patch_prober.go:28] interesting pod/router-default-5444994796-rxpbh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 16:34:11 crc kubenswrapper[4697]: [-]has-synced failed: reason withheld Feb 20 16:34:11 crc kubenswrapper[4697]: [+]process-running ok Feb 20 16:34:11 crc kubenswrapper[4697]: healthz check failed Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.401695 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rxpbh" podUID="5bd2236d-2830-47cd-9eb8-9d9f07c821b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.492363 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.731976 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s4bgz"] Feb 20 16:34:11 crc kubenswrapper[4697]: W0220 16:34:11.737799 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf1acafc_f7fd_49ab_b574_9aed683db705.slice/crio-3f9696f3e08dadd4eb55fad2e6bb7f5867baca0c7de39bf4cd71633e2481e99d WatchSource:0}: Error finding container 3f9696f3e08dadd4eb55fad2e6bb7f5867baca0c7de39bf4cd71633e2481e99d: Status 404 returned error can't find the container with id 3f9696f3e08dadd4eb55fad2e6bb7f5867baca0c7de39bf4cd71633e2481e99d Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.795706 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5bghw"] Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.797078 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.803713 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.869854 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5bghw"] Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.916404 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frqg5\" (UniqueName: \"kubernetes.io/projected/aa8bd21f-2da9-4ece-917f-091ada68a2bd-kube-api-access-frqg5\") pod \"redhat-marketplace-5bghw\" (UID: \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.916492 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-utilities\") pod \"redhat-marketplace-5bghw\" (UID: \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:34:11 crc kubenswrapper[4697]: I0220 16:34:11.916522 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-catalog-content\") pod \"redhat-marketplace-5bghw\" (UID: \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.030144 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-utilities\") pod \"redhat-marketplace-5bghw\" (UID: 
\"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.030201 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-catalog-content\") pod \"redhat-marketplace-5bghw\" (UID: \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.030315 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frqg5\" (UniqueName: \"kubernetes.io/projected/aa8bd21f-2da9-4ece-917f-091ada68a2bd-kube-api-access-frqg5\") pod \"redhat-marketplace-5bghw\" (UID: \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.031591 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-utilities\") pod \"redhat-marketplace-5bghw\" (UID: \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.031698 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-catalog-content\") pod \"redhat-marketplace-5bghw\" (UID: \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.072467 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frqg5\" (UniqueName: \"kubernetes.io/projected/aa8bd21f-2da9-4ece-917f-091ada68a2bd-kube-api-access-frqg5\") pod \"redhat-marketplace-5bghw\" (UID: 
\"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.118333 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.195648 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rmfdq"] Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.196641 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmfdq" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.222162 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmfdq"] Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.257676 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" event={"ID":"bf1acafc-f7fd-49ab-b574-9aed683db705","Type":"ContainerStarted","Data":"e3d5e881c62660cf3778ca93631902e664dcbe8ca873445984c3be1904ff2d82"} Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.257745 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" event={"ID":"bf1acafc-f7fd-49ab-b574-9aed683db705","Type":"ContainerStarted","Data":"3f9696f3e08dadd4eb55fad2e6bb7f5867baca0c7de39bf4cd71633e2481e99d"} Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.258350 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.261465 4697 generic.go:334] "Generic (PLEG): container finished" podID="efb428f3-f886-42e8-91cc-fba15966023c" containerID="1d0a2b06285771bd6209a3b3a457added3bd9e3dfa1f623b9cf4c24b49ea0d5d" exitCode=0 Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 
16:34:12.262628 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k68tp" event={"ID":"efb428f3-f886-42e8-91cc-fba15966023c","Type":"ContainerDied","Data":"1d0a2b06285771bd6209a3b3a457added3bd9e3dfa1f623b9cf4c24b49ea0d5d"} Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.297759 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" podStartSLOduration=136.297740358 podStartE2EDuration="2m16.297740358s" podCreationTimestamp="2026-02-20 16:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:12.293742663 +0000 UTC m=+160.073788071" watchObservedRunningTime="2026-02-20 16:34:12.297740358 +0000 UTC m=+160.077785766" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.314049 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.314686 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.333294 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-utilities\") pod \"redhat-marketplace-rmfdq\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") " pod="openshift-marketplace/redhat-marketplace-rmfdq" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.333399 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-catalog-content\") pod \"redhat-marketplace-rmfdq\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") " pod="openshift-marketplace/redhat-marketplace-rmfdq" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.333416 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9k7j\" (UniqueName: \"kubernetes.io/projected/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-kube-api-access-l9k7j\") pod \"redhat-marketplace-rmfdq\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") " pod="openshift-marketplace/redhat-marketplace-rmfdq" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.336476 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.336691 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.348455 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.396546 4697 patch_prober.go:28] interesting 
pod/router-default-5444994796-rxpbh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 16:34:12 crc kubenswrapper[4697]: [-]has-synced failed: reason withheld Feb 20 16:34:12 crc kubenswrapper[4697]: [+]process-running ok Feb 20 16:34:12 crc kubenswrapper[4697]: healthz check failed Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.396615 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rxpbh" podUID="5bd2236d-2830-47cd-9eb8-9d9f07c821b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.436944 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-utilities\") pod \"redhat-marketplace-rmfdq\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") " pod="openshift-marketplace/redhat-marketplace-rmfdq" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.437075 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.437119 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9k7j\" (UniqueName: \"kubernetes.io/projected/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-kube-api-access-l9k7j\") pod \"redhat-marketplace-rmfdq\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") " pod="openshift-marketplace/redhat-marketplace-rmfdq" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 
16:34:12.437145 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-catalog-content\") pod \"redhat-marketplace-rmfdq\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") " pod="openshift-marketplace/redhat-marketplace-rmfdq" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.437185 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.437534 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-utilities\") pod \"redhat-marketplace-rmfdq\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") " pod="openshift-marketplace/redhat-marketplace-rmfdq" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.438348 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-catalog-content\") pod \"redhat-marketplace-rmfdq\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") " pod="openshift-marketplace/redhat-marketplace-rmfdq" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.458280 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9k7j\" (UniqueName: \"kubernetes.io/projected/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-kube-api-access-l9k7j\") pod \"redhat-marketplace-rmfdq\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") " pod="openshift-marketplace/redhat-marketplace-rmfdq" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.538320 4697 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.538506 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.539091 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.554340 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmfdq" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.562890 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.654309 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.670150 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5bghw"] Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.921051 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 20 16:34:12 crc kubenswrapper[4697]: I0220 16:34:12.961585 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmfdq"] Feb 20 16:34:12 crc kubenswrapper[4697]: W0220 16:34:12.989577 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d026728_bdc4_4f2e_8162_fca5d1ba93ca.slice/crio-7bf2640bd70d7b5bab1f93a2f1a2d2240d2e6e543b214f36d877c63a6beca2b5 WatchSource:0}: Error finding container 7bf2640bd70d7b5bab1f93a2f1a2d2240d2e6e543b214f36d877c63a6beca2b5: Status 404 returned error can't find the container with id 7bf2640bd70d7b5bab1f93a2f1a2d2240d2e6e543b214f36d877c63a6beca2b5 Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.116637 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.117350 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.119684 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.119858 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.121483 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.201807 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4zh8s"] Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.203361 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.205824 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.207181 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4zh8s"] Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.219835 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 20 16:34:13 crc kubenswrapper[4697]: W0220 16:34:13.247764 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poded07cfd3_4a3a_4bd5_a701_799aeca8b6fd.slice/crio-0b853afdea84b18af5e5de88af512586c6f5ff2873971a089461e02d00e3e8db WatchSource:0}: Error finding container 0b853afdea84b18af5e5de88af512586c6f5ff2873971a089461e02d00e3e8db: Status 404 returned error can't find the container with id 
0b853afdea84b18af5e5de88af512586c6f5ff2873971a089461e02d00e3e8db Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.254973 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36704629-b795-40ba-b710-ae563e2fb49b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"36704629-b795-40ba-b710-ae563e2fb49b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.255053 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36704629-b795-40ba-b710-ae563e2fb49b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"36704629-b795-40ba-b710-ae563e2fb49b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.288222 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmfdq" event={"ID":"7d026728-bdc4-4f2e-8162-fca5d1ba93ca","Type":"ContainerStarted","Data":"e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8"} Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.288310 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmfdq" event={"ID":"7d026728-bdc4-4f2e-8162-fca5d1ba93ca","Type":"ContainerStarted","Data":"7bf2640bd70d7b5bab1f93a2f1a2d2240d2e6e543b214f36d877c63a6beca2b5"} Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.294370 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd","Type":"ContainerStarted","Data":"0b853afdea84b18af5e5de88af512586c6f5ff2873971a089461e02d00e3e8db"} Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.297664 4697 generic.go:334] "Generic (PLEG): container finished" 
podID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" containerID="8cd83e91dfafd1e82abb084af428146d7885d381d0e7da57e91488fde863d1f2" exitCode=0 Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.297732 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bghw" event={"ID":"aa8bd21f-2da9-4ece-917f-091ada68a2bd","Type":"ContainerDied","Data":"8cd83e91dfafd1e82abb084af428146d7885d381d0e7da57e91488fde863d1f2"} Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.297769 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bghw" event={"ID":"aa8bd21f-2da9-4ece-917f-091ada68a2bd","Type":"ContainerStarted","Data":"8e0fc9d32d791f0ea564662c95ff892b7b7c9c1badbf1254ad2f3187c8ca4119"} Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.301422 4697 generic.go:334] "Generic (PLEG): container finished" podID="95906860-29b3-49a2-b903-37b9a5a808a5" containerID="4464255c5a1c57a8e2a67aaf79c6b738fc5cacdb33c0ae9c5950c814b4acf82a" exitCode=0 Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.301526 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" event={"ID":"95906860-29b3-49a2-b903-37b9a5a808a5","Type":"ContainerDied","Data":"4464255c5a1c57a8e2a67aaf79c6b738fc5cacdb33c0ae9c5950c814b4acf82a"} Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.358843 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-utilities\") pod \"redhat-operators-4zh8s\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.359292 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-catalog-content\") pod \"redhat-operators-4zh8s\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.359370 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n4k7\" (UniqueName: \"kubernetes.io/projected/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-kube-api-access-4n4k7\") pod \"redhat-operators-4zh8s\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.359398 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36704629-b795-40ba-b710-ae563e2fb49b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"36704629-b795-40ba-b710-ae563e2fb49b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.359443 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36704629-b795-40ba-b710-ae563e2fb49b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"36704629-b795-40ba-b710-ae563e2fb49b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.362290 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36704629-b795-40ba-b710-ae563e2fb49b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"36704629-b795-40ba-b710-ae563e2fb49b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.380150 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/36704629-b795-40ba-b710-ae563e2fb49b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"36704629-b795-40ba-b710-ae563e2fb49b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.394497 4697 patch_prober.go:28] interesting pod/router-default-5444994796-rxpbh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 16:34:13 crc kubenswrapper[4697]: [-]has-synced failed: reason withheld Feb 20 16:34:13 crc kubenswrapper[4697]: [+]process-running ok Feb 20 16:34:13 crc kubenswrapper[4697]: healthz check failed Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.394573 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rxpbh" podUID="5bd2236d-2830-47cd-9eb8-9d9f07c821b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.452684 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.461571 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-utilities\") pod \"redhat-operators-4zh8s\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.461655 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-catalog-content\") pod \"redhat-operators-4zh8s\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.462184 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-utilities\") pod \"redhat-operators-4zh8s\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.462271 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-catalog-content\") pod \"redhat-operators-4zh8s\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.463372 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n4k7\" (UniqueName: \"kubernetes.io/projected/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-kube-api-access-4n4k7\") pod \"redhat-operators-4zh8s\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " 
pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.477045 4697 patch_prober.go:28] interesting pod/downloads-7954f5f757-t7pft container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.477116 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-t7pft" podUID="314a7b24-36b9-41de-9ec3-3c229fc43b3d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.477386 4697 patch_prober.go:28] interesting pod/downloads-7954f5f757-t7pft container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.477455 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t7pft" podUID="314a7b24-36b9-41de-9ec3-3c229fc43b3d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.481030 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n4k7\" (UniqueName: \"kubernetes.io/projected/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-kube-api-access-4n4k7\") pod \"redhat-operators-4zh8s\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.491622 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.496269 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nct46" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.542394 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.608922 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j7c4d"] Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.610127 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.621941 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7c4d"] Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.776046 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-utilities\") pod \"redhat-operators-j7c4d\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.776567 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-catalog-content\") pod \"redhat-operators-j7c4d\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.776618 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqxrp\" (UniqueName: 
\"kubernetes.io/projected/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-kube-api-access-jqxrp\") pod \"redhat-operators-j7c4d\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.831379 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.881860 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-catalog-content\") pod \"redhat-operators-j7c4d\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.881918 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqxrp\" (UniqueName: \"kubernetes.io/projected/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-kube-api-access-jqxrp\") pod \"redhat-operators-j7c4d\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.882018 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-utilities\") pod \"redhat-operators-j7c4d\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.882384 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-catalog-content\") pod \"redhat-operators-j7c4d\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 
16:34:13.891345 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-utilities\") pod \"redhat-operators-j7c4d\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.910772 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqxrp\" (UniqueName: \"kubernetes.io/projected/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-kube-api-access-jqxrp\") pod \"redhat-operators-j7c4d\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.921972 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.922010 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.931100 4697 patch_prober.go:28] interesting pod/console-f9d7485db-txsqk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.931148 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-txsqk" podUID="70f9d3b5-82e4-47b2-ba65-88980dc9b401" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 20 16:34:13 crc kubenswrapper[4697]: I0220 16:34:13.974325 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.209845 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4zh8s"] Feb 20 16:34:14 crc kubenswrapper[4697]: W0220 16:34:14.223646 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda54aa135_a749_4f61_8bc7_bfea1dbd70dd.slice/crio-403749bc2c9e7f1a7790937e5e7a1f059a6078bb14ce9b87790d249827906471 WatchSource:0}: Error finding container 403749bc2c9e7f1a7790937e5e7a1f059a6078bb14ce9b87790d249827906471: Status 404 returned error can't find the container with id 403749bc2c9e7f1a7790937e5e7a1f059a6078bb14ce9b87790d249827906471 Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.279953 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.285954 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-mqvww" Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.311669 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd","Type":"ContainerStarted","Data":"3c74db434ef805e877800cbbf0f19cdc6cd662f00c7dfbf23328a11b34d659bb"} Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.313533 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"36704629-b795-40ba-b710-ae563e2fb49b","Type":"ContainerStarted","Data":"e8d04a67744dd8ccfd5c23057d42f541c072d35f92b2b2da4b4b3d530b8b2536"} Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.323119 4697 generic.go:334] "Generic (PLEG): container finished" podID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" 
containerID="e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8" exitCode=0 Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.323226 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmfdq" event={"ID":"7d026728-bdc4-4f2e-8162-fca5d1ba93ca","Type":"ContainerDied","Data":"e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8"} Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.352070 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zh8s" event={"ID":"a54aa135-a749-4f61-8bc7-bfea1dbd70dd","Type":"ContainerStarted","Data":"403749bc2c9e7f1a7790937e5e7a1f059a6078bb14ce9b87790d249827906471"} Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.399932 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.420569 4697 patch_prober.go:28] interesting pod/router-default-5444994796-rxpbh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 16:34:14 crc kubenswrapper[4697]: [-]has-synced failed: reason withheld Feb 20 16:34:14 crc kubenswrapper[4697]: [+]process-running ok Feb 20 16:34:14 crc kubenswrapper[4697]: healthz check failed Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.420614 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rxpbh" podUID="5bd2236d-2830-47cd-9eb8-9d9f07c821b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.437175 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.437158516 
podStartE2EDuration="2.437158516s" podCreationTimestamp="2026-02-20 16:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:14.436443488 +0000 UTC m=+162.216488896" watchObservedRunningTime="2026-02-20 16:34:14.437158516 +0000 UTC m=+162.217203924" Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.532391 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7c4d"] Feb 20 16:34:14 crc kubenswrapper[4697]: W0220 16:34:14.695361 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb38bb570_3ff7_4b0c_be67_9a40ff0b3b1c.slice/crio-365ed9085b0b33aab41edd4c8999d68555a76a510f32d37407a7d2e7450b14a5 WatchSource:0}: Error finding container 365ed9085b0b33aab41edd4c8999d68555a76a510f32d37407a7d2e7450b14a5: Status 404 returned error can't find the container with id 365ed9085b0b33aab41edd4c8999d68555a76a510f32d37407a7d2e7450b14a5 Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.736343 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:34:14 crc kubenswrapper[4697]: I0220 16:34:14.909446 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.041885 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95906860-29b3-49a2-b903-37b9a5a808a5-secret-volume\") pod \"95906860-29b3-49a2-b903-37b9a5a808a5\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.041980 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95906860-29b3-49a2-b903-37b9a5a808a5-config-volume\") pod \"95906860-29b3-49a2-b903-37b9a5a808a5\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.042025 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c2tx\" (UniqueName: \"kubernetes.io/projected/95906860-29b3-49a2-b903-37b9a5a808a5-kube-api-access-2c2tx\") pod \"95906860-29b3-49a2-b903-37b9a5a808a5\" (UID: \"95906860-29b3-49a2-b903-37b9a5a808a5\") " Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.043365 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95906860-29b3-49a2-b903-37b9a5a808a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "95906860-29b3-49a2-b903-37b9a5a808a5" (UID: "95906860-29b3-49a2-b903-37b9a5a808a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.048737 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95906860-29b3-49a2-b903-37b9a5a808a5-kube-api-access-2c2tx" (OuterVolumeSpecName: "kube-api-access-2c2tx") pod "95906860-29b3-49a2-b903-37b9a5a808a5" (UID: "95906860-29b3-49a2-b903-37b9a5a808a5"). 
InnerVolumeSpecName "kube-api-access-2c2tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.050208 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95906860-29b3-49a2-b903-37b9a5a808a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "95906860-29b3-49a2-b903-37b9a5a808a5" (UID: "95906860-29b3-49a2-b903-37b9a5a808a5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.143835 4697 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95906860-29b3-49a2-b903-37b9a5a808a5-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.143863 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c2tx\" (UniqueName: \"kubernetes.io/projected/95906860-29b3-49a2-b903-37b9a5a808a5-kube-api-access-2c2tx\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.143873 4697 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95906860-29b3-49a2-b903-37b9a5a808a5-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.390870 4697 generic.go:334] "Generic (PLEG): container finished" podID="ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd" containerID="3c74db434ef805e877800cbbf0f19cdc6cd662f00c7dfbf23328a11b34d659bb" exitCode=0 Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.391370 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd","Type":"ContainerDied","Data":"3c74db434ef805e877800cbbf0f19cdc6cd662f00c7dfbf23328a11b34d659bb"} Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.400679 4697 
patch_prober.go:28] interesting pod/router-default-5444994796-rxpbh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 16:34:15 crc kubenswrapper[4697]: [-]has-synced failed: reason withheld Feb 20 16:34:15 crc kubenswrapper[4697]: [+]process-running ok Feb 20 16:34:15 crc kubenswrapper[4697]: healthz check failed Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.400830 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rxpbh" podUID="5bd2236d-2830-47cd-9eb8-9d9f07c821b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.413143 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" event={"ID":"95906860-29b3-49a2-b903-37b9a5a808a5","Type":"ContainerDied","Data":"7a53567664aca3d284841e81dbcc2728c226ac72c140554e4350c42b226656b5"} Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.413181 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a53567664aca3d284841e81dbcc2728c226ac72c140554e4350c42b226656b5" Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.413267 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8" Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.424980 4697 generic.go:334] "Generic (PLEG): container finished" podID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerID="cde1fd3a62bee76a5a86951f56f281122c088f3e9d6f9ebefb3ceffea329ff84" exitCode=0 Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.425054 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7c4d" event={"ID":"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c","Type":"ContainerDied","Data":"cde1fd3a62bee76a5a86951f56f281122c088f3e9d6f9ebefb3ceffea329ff84"} Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.425084 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7c4d" event={"ID":"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c","Type":"ContainerStarted","Data":"365ed9085b0b33aab41edd4c8999d68555a76a510f32d37407a7d2e7450b14a5"} Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.432756 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"36704629-b795-40ba-b710-ae563e2fb49b","Type":"ContainerStarted","Data":"f2738262b14356a78caa6f16244ea6c1a7a844b22c8cee7faf5261fd90ad6431"} Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.436492 4697 generic.go:334] "Generic (PLEG): container finished" podID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerID="7467c838bd30986fafdc9bca0f64e3cfbe67ba7f0eddf735e9fe77439e99277a" exitCode=0 Feb 20 16:34:15 crc kubenswrapper[4697]: I0220 16:34:15.436549 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zh8s" event={"ID":"a54aa135-a749-4f61-8bc7-bfea1dbd70dd","Type":"ContainerDied","Data":"7467c838bd30986fafdc9bca0f64e3cfbe67ba7f0eddf735e9fe77439e99277a"} Feb 20 16:34:16 crc kubenswrapper[4697]: I0220 16:34:16.393489 4697 patch_prober.go:28] interesting 
pod/router-default-5444994796-rxpbh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 16:34:16 crc kubenswrapper[4697]: [+]has-synced ok Feb 20 16:34:16 crc kubenswrapper[4697]: [+]process-running ok Feb 20 16:34:16 crc kubenswrapper[4697]: healthz check failed Feb 20 16:34:16 crc kubenswrapper[4697]: I0220 16:34:16.393547 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rxpbh" podUID="5bd2236d-2830-47cd-9eb8-9d9f07c821b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 16:34:16 crc kubenswrapper[4697]: I0220 16:34:16.470276 4697 generic.go:334] "Generic (PLEG): container finished" podID="36704629-b795-40ba-b710-ae563e2fb49b" containerID="f2738262b14356a78caa6f16244ea6c1a7a844b22c8cee7faf5261fd90ad6431" exitCode=0 Feb 20 16:34:16 crc kubenswrapper[4697]: I0220 16:34:16.470388 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"36704629-b795-40ba-b710-ae563e2fb49b","Type":"ContainerDied","Data":"f2738262b14356a78caa6f16244ea6c1a7a844b22c8cee7faf5261fd90ad6431"} Feb 20 16:34:16 crc kubenswrapper[4697]: I0220 16:34:16.794761 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 16:34:16 crc kubenswrapper[4697]: I0220 16:34:16.977511 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kube-api-access\") pod \"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd\" (UID: \"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd\") " Feb 20 16:34:16 crc kubenswrapper[4697]: I0220 16:34:16.977596 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kubelet-dir\") pod \"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd\" (UID: \"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd\") " Feb 20 16:34:16 crc kubenswrapper[4697]: I0220 16:34:16.978108 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd" (UID: "ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:34:16 crc kubenswrapper[4697]: I0220 16:34:16.982819 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd" (UID: "ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:34:17 crc kubenswrapper[4697]: I0220 16:34:17.079240 4697 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:17 crc kubenswrapper[4697]: I0220 16:34:17.079277 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:17 crc kubenswrapper[4697]: I0220 16:34:17.394478 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:17 crc kubenswrapper[4697]: I0220 16:34:17.397310 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-rxpbh" Feb 20 16:34:17 crc kubenswrapper[4697]: I0220 16:34:17.484943 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd","Type":"ContainerDied","Data":"0b853afdea84b18af5e5de88af512586c6f5ff2873971a089461e02d00e3e8db"} Feb 20 16:34:17 crc kubenswrapper[4697]: I0220 16:34:17.485014 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b853afdea84b18af5e5de88af512586c6f5ff2873971a089461e02d00e3e8db" Feb 20 16:34:17 crc kubenswrapper[4697]: I0220 16:34:17.485578 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 16:34:18 crc kubenswrapper[4697]: I0220 16:34:18.426901 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:34:18 crc kubenswrapper[4697]: I0220 16:34:18.432895 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aff33f1-a871-41df-a6f1-fd7146e23a9c-metrics-certs\") pod \"network-metrics-daemon-nskrw\" (UID: \"0aff33f1-a871-41df-a6f1-fd7146e23a9c\") " pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:34:18 crc kubenswrapper[4697]: I0220 16:34:18.535087 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nskrw" Feb 20 16:34:19 crc kubenswrapper[4697]: I0220 16:34:19.549011 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g58sg" Feb 20 16:34:23 crc kubenswrapper[4697]: I0220 16:34:23.490581 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-t7pft" Feb 20 16:34:23 crc kubenswrapper[4697]: I0220 16:34:23.933346 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:23 crc kubenswrapper[4697]: I0220 16:34:23.937008 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:34:27 crc kubenswrapper[4697]: I0220 16:34:27.420946 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 16:34:27 crc kubenswrapper[4697]: I0220 16:34:27.567958 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"36704629-b795-40ba-b710-ae563e2fb49b","Type":"ContainerDied","Data":"e8d04a67744dd8ccfd5c23057d42f541c072d35f92b2b2da4b4b3d530b8b2536"} Feb 20 16:34:27 crc kubenswrapper[4697]: I0220 16:34:27.568001 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8d04a67744dd8ccfd5c23057d42f541c072d35f92b2b2da4b4b3d530b8b2536" Feb 20 16:34:27 crc kubenswrapper[4697]: I0220 16:34:27.568035 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 16:34:27 crc kubenswrapper[4697]: I0220 16:34:27.616392 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36704629-b795-40ba-b710-ae563e2fb49b-kubelet-dir\") pod \"36704629-b795-40ba-b710-ae563e2fb49b\" (UID: \"36704629-b795-40ba-b710-ae563e2fb49b\") " Feb 20 16:34:27 crc kubenswrapper[4697]: I0220 16:34:27.616505 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36704629-b795-40ba-b710-ae563e2fb49b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "36704629-b795-40ba-b710-ae563e2fb49b" (UID: "36704629-b795-40ba-b710-ae563e2fb49b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:34:27 crc kubenswrapper[4697]: I0220 16:34:27.616546 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36704629-b795-40ba-b710-ae563e2fb49b-kube-api-access\") pod \"36704629-b795-40ba-b710-ae563e2fb49b\" (UID: \"36704629-b795-40ba-b710-ae563e2fb49b\") " Feb 20 16:34:27 crc kubenswrapper[4697]: I0220 16:34:27.616819 4697 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36704629-b795-40ba-b710-ae563e2fb49b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:27 crc kubenswrapper[4697]: I0220 16:34:27.634805 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36704629-b795-40ba-b710-ae563e2fb49b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "36704629-b795-40ba-b710-ae563e2fb49b" (UID: "36704629-b795-40ba-b710-ae563e2fb49b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:34:27 crc kubenswrapper[4697]: I0220 16:34:27.718552 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36704629-b795-40ba-b710-ae563e2fb49b-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:30 crc kubenswrapper[4697]: I0220 16:34:30.121491 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 16:34:31 crc kubenswrapper[4697]: I0220 16:34:31.184412 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:34:31 crc kubenswrapper[4697]: I0220 16:34:31.184511 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:34:31 crc kubenswrapper[4697]: I0220 16:34:31.497156 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:34:37 crc kubenswrapper[4697]: E0220 16:34:37.229235 4697 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 20 16:34:37 crc kubenswrapper[4697]: E0220 16:34:37.230324 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jqxrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-j7c4d_openshift-marketplace(b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 16:34:37 crc kubenswrapper[4697]: E0220 16:34:37.231764 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-j7c4d" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" Feb 20 16:34:37 crc kubenswrapper[4697]: E0220 16:34:37.241084 4697 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 20 16:34:37 crc kubenswrapper[4697]: E0220 16:34:37.241253 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mxx9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},
TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7b742_openshift-marketplace(a400c2f0-eb54-4938-a0b1-fa91e1d18ff3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 16:34:37 crc kubenswrapper[4697]: E0220 16:34:37.243035 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7b742" podUID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" Feb 20 16:34:37 crc kubenswrapper[4697]: I0220 16:34:37.617077 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k68tp" event={"ID":"efb428f3-f886-42e8-91cc-fba15966023c","Type":"ContainerStarted","Data":"ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968"} Feb 20 16:34:37 crc kubenswrapper[4697]: I0220 16:34:37.628516 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zh8s" event={"ID":"a54aa135-a749-4f61-8bc7-bfea1dbd70dd","Type":"ContainerStarted","Data":"2bf8657f5a02542c6f29ec1668ac422d17969f3d108ac1821a41526040959380"} Feb 20 16:34:37 crc kubenswrapper[4697]: I0220 16:34:37.631335 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmfdq" event={"ID":"7d026728-bdc4-4f2e-8162-fca5d1ba93ca","Type":"ContainerStarted","Data":"f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979"} Feb 20 16:34:37 crc kubenswrapper[4697]: I0220 16:34:37.631363 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nskrw"] Feb 20 16:34:37 crc kubenswrapper[4697]: 
I0220 16:34:37.633854 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcxf" event={"ID":"7e335707-03f3-4791-85c7-95134150dc71","Type":"ContainerStarted","Data":"f30e40dfd9ef2c444770096c127233d04b5380fde7e5fe1e5110fe8a8cc9fa41"} Feb 20 16:34:37 crc kubenswrapper[4697]: I0220 16:34:37.637120 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kb7jz" event={"ID":"e9652ea9-a08a-4700-9a77-00b227062e91","Type":"ContainerStarted","Data":"18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3"} Feb 20 16:34:37 crc kubenswrapper[4697]: I0220 16:34:37.642402 4697 generic.go:334] "Generic (PLEG): container finished" podID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" containerID="624197d830ca5b583633b86caaa0b2d582a10afd1fc894758a6fb35e30dd1e12" exitCode=0 Feb 20 16:34:37 crc kubenswrapper[4697]: I0220 16:34:37.643208 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bghw" event={"ID":"aa8bd21f-2da9-4ece-917f-091ada68a2bd","Type":"ContainerDied","Data":"624197d830ca5b583633b86caaa0b2d582a10afd1fc894758a6fb35e30dd1e12"} Feb 20 16:34:37 crc kubenswrapper[4697]: E0220 16:34:37.648380 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-j7c4d" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" Feb 20 16:34:37 crc kubenswrapper[4697]: E0220 16:34:37.649474 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7b742" podUID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" Feb 20 16:34:37 crc 
kubenswrapper[4697]: W0220 16:34:37.703167 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aff33f1_a871_41df_a6f1_fd7146e23a9c.slice/crio-f4a6b449948cc18e29cd69291c64dd6f676e01d89524a7663e43bc52b2b11400 WatchSource:0}: Error finding container f4a6b449948cc18e29cd69291c64dd6f676e01d89524a7663e43bc52b2b11400: Status 404 returned error can't find the container with id f4a6b449948cc18e29cd69291c64dd6f676e01d89524a7663e43bc52b2b11400 Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.655603 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bghw" event={"ID":"aa8bd21f-2da9-4ece-917f-091ada68a2bd","Type":"ContainerStarted","Data":"542d55895b6d3b6963278b0d53fec392be67737fe5e6919aa5b14cb7061cfaa2"} Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.658841 4697 generic.go:334] "Generic (PLEG): container finished" podID="efb428f3-f886-42e8-91cc-fba15966023c" containerID="ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968" exitCode=0 Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.658964 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k68tp" event={"ID":"efb428f3-f886-42e8-91cc-fba15966023c","Type":"ContainerDied","Data":"ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968"} Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.665032 4697 generic.go:334] "Generic (PLEG): container finished" podID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" containerID="f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979" exitCode=0 Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.665154 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmfdq" event={"ID":"7d026728-bdc4-4f2e-8162-fca5d1ba93ca","Type":"ContainerDied","Data":"f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979"} Feb 
20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.665195 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmfdq" event={"ID":"7d026728-bdc4-4f2e-8162-fca5d1ba93ca","Type":"ContainerStarted","Data":"4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5"} Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.667899 4697 generic.go:334] "Generic (PLEG): container finished" podID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerID="2bf8657f5a02542c6f29ec1668ac422d17969f3d108ac1821a41526040959380" exitCode=0 Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.668221 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zh8s" event={"ID":"a54aa135-a749-4f61-8bc7-bfea1dbd70dd","Type":"ContainerDied","Data":"2bf8657f5a02542c6f29ec1668ac422d17969f3d108ac1821a41526040959380"} Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.671188 4697 generic.go:334] "Generic (PLEG): container finished" podID="7e335707-03f3-4791-85c7-95134150dc71" containerID="f30e40dfd9ef2c444770096c127233d04b5380fde7e5fe1e5110fe8a8cc9fa41" exitCode=0 Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.671255 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcxf" event={"ID":"7e335707-03f3-4791-85c7-95134150dc71","Type":"ContainerDied","Data":"f30e40dfd9ef2c444770096c127233d04b5380fde7e5fe1e5110fe8a8cc9fa41"} Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.677179 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nskrw" event={"ID":"0aff33f1-a871-41df-a6f1-fd7146e23a9c","Type":"ContainerStarted","Data":"a7f695dff066e6405cd975c2195d95dacdbb63f7bb39e808be930d24fd5be665"} Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.677210 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nskrw" 
event={"ID":"0aff33f1-a871-41df-a6f1-fd7146e23a9c","Type":"ContainerStarted","Data":"579ab9fcef00fa473765ce9826b8c50ce2a549c8825a34bc22d763c6bd7d4d73"}
Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.677222 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nskrw" event={"ID":"0aff33f1-a871-41df-a6f1-fd7146e23a9c","Type":"ContainerStarted","Data":"f4a6b449948cc18e29cd69291c64dd6f676e01d89524a7663e43bc52b2b11400"}
Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.680839 4697 generic.go:334] "Generic (PLEG): container finished" podID="e9652ea9-a08a-4700-9a77-00b227062e91" containerID="18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3" exitCode=0
Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.680903 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kb7jz" event={"ID":"e9652ea9-a08a-4700-9a77-00b227062e91","Type":"ContainerDied","Data":"18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3"}
Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.696316 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5bghw" podStartSLOduration=2.9403249259999997 podStartE2EDuration="27.696297212s" podCreationTimestamp="2026-02-20 16:34:11 +0000 UTC" firstStartedPulling="2026-02-20 16:34:13.319010011 +0000 UTC m=+161.099055419" lastFinishedPulling="2026-02-20 16:34:38.074982287 +0000 UTC m=+185.855027705" observedRunningTime="2026-02-20 16:34:38.682971343 +0000 UTC m=+186.463016791" watchObservedRunningTime="2026-02-20 16:34:38.696297212 +0000 UTC m=+186.476342620"
Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.714042 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nskrw" podStartSLOduration=163.71401838 podStartE2EDuration="2m43.71401838s" podCreationTimestamp="2026-02-20 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:38.711907853 +0000 UTC m=+186.491953261" watchObservedRunningTime="2026-02-20 16:34:38.71401838 +0000 UTC m=+186.494063808"
Feb 20 16:34:38 crc kubenswrapper[4697]: I0220 16:34:38.738114 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rmfdq" podStartSLOduration=3.012781982 podStartE2EDuration="26.73809347s" podCreationTimestamp="2026-02-20 16:34:12 +0000 UTC" firstStartedPulling="2026-02-20 16:34:14.331980352 +0000 UTC m=+162.112025760" lastFinishedPulling="2026-02-20 16:34:38.05729184 +0000 UTC m=+185.837337248" observedRunningTime="2026-02-20 16:34:38.735565722 +0000 UTC m=+186.515611140" watchObservedRunningTime="2026-02-20 16:34:38.73809347 +0000 UTC m=+186.518138878"
Feb 20 16:34:39 crc kubenswrapper[4697]: I0220 16:34:39.686955 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zh8s" event={"ID":"a54aa135-a749-4f61-8bc7-bfea1dbd70dd","Type":"ContainerStarted","Data":"150322f414cdd99b3cfea31ef5d9ab39ee374a47f47736ff3bca914459f5a3e0"}
Feb 20 16:34:39 crc kubenswrapper[4697]: I0220 16:34:39.689418 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcxf" event={"ID":"7e335707-03f3-4791-85c7-95134150dc71","Type":"ContainerStarted","Data":"ed1f384c339be882e0e0c49f6fd07edbb92250c96c82721663b91857f0db2dba"}
Feb 20 16:34:39 crc kubenswrapper[4697]: I0220 16:34:39.693126 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kb7jz" event={"ID":"e9652ea9-a08a-4700-9a77-00b227062e91","Type":"ContainerStarted","Data":"a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f"}
Feb 20 16:34:39 crc kubenswrapper[4697]: I0220 16:34:39.695301 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k68tp" event={"ID":"efb428f3-f886-42e8-91cc-fba15966023c","Type":"ContainerStarted","Data":"f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859"}
Feb 20 16:34:39 crc kubenswrapper[4697]: I0220 16:34:39.712160 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4zh8s" podStartSLOduration=3.009263837 podStartE2EDuration="26.712140743s" podCreationTimestamp="2026-02-20 16:34:13 +0000 UTC" firstStartedPulling="2026-02-20 16:34:15.439504103 +0000 UTC m=+163.219549511" lastFinishedPulling="2026-02-20 16:34:39.142381009 +0000 UTC m=+186.922426417" observedRunningTime="2026-02-20 16:34:39.710618992 +0000 UTC m=+187.490664400" watchObservedRunningTime="2026-02-20 16:34:39.712140743 +0000 UTC m=+187.492186151"
Feb 20 16:34:39 crc kubenswrapper[4697]: I0220 16:34:39.777849 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qjcxf" podStartSLOduration=2.9477977490000002 podStartE2EDuration="30.777826445s" podCreationTimestamp="2026-02-20 16:34:09 +0000 UTC" firstStartedPulling="2026-02-20 16:34:11.239656281 +0000 UTC m=+159.019701689" lastFinishedPulling="2026-02-20 16:34:39.069684977 +0000 UTC m=+186.849730385" observedRunningTime="2026-02-20 16:34:39.759633855 +0000 UTC m=+187.539679253" watchObservedRunningTime="2026-02-20 16:34:39.777826445 +0000 UTC m=+187.557871853"
Feb 20 16:34:39 crc kubenswrapper[4697]: I0220 16:34:39.810316 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kb7jz" podStartSLOduration=1.8559705420000001 podStartE2EDuration="29.810291171s" podCreationTimestamp="2026-02-20 16:34:10 +0000 UTC" firstStartedPulling="2026-02-20 16:34:11.229712714 +0000 UTC m=+159.009758122" lastFinishedPulling="2026-02-20 16:34:39.184033343 +0000 UTC m=+186.964078751" observedRunningTime="2026-02-20 16:34:39.807917087 +0000 UTC m=+187.587962495" watchObservedRunningTime="2026-02-20 16:34:39.810291171 +0000 UTC m=+187.590336569"
Feb 20 16:34:39 crc kubenswrapper[4697]: I0220 16:34:39.843979 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k68tp" podStartSLOduration=3.018343153 podStartE2EDuration="29.84396031s" podCreationTimestamp="2026-02-20 16:34:10 +0000 UTC" firstStartedPulling="2026-02-20 16:34:12.263403442 +0000 UTC m=+160.043448850" lastFinishedPulling="2026-02-20 16:34:39.089020599 +0000 UTC m=+186.869066007" observedRunningTime="2026-02-20 16:34:39.841990247 +0000 UTC m=+187.622035655" watchObservedRunningTime="2026-02-20 16:34:39.84396031 +0000 UTC m=+187.624005718"
Feb 20 16:34:40 crc kubenswrapper[4697]: I0220 16:34:40.330753 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qjcxf"
Feb 20 16:34:40 crc kubenswrapper[4697]: I0220 16:34:40.331506 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qjcxf"
Feb 20 16:34:40 crc kubenswrapper[4697]: I0220 16:34:40.530538 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kb7jz"
Feb 20 16:34:40 crc kubenswrapper[4697]: I0220 16:34:40.530592 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kb7jz"
Feb 20 16:34:40 crc kubenswrapper[4697]: I0220 16:34:40.761226 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k68tp"
Feb 20 16:34:40 crc kubenswrapper[4697]: I0220 16:34:40.761285 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k68tp"
Feb 20 16:34:41 crc kubenswrapper[4697]: I0220 16:34:41.468361 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qjcxf" podUID="7e335707-03f3-4791-85c7-95134150dc71" containerName="registry-server" probeResult="failure" output=<
Feb 20 16:34:41 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s
Feb 20 16:34:41 crc kubenswrapper[4697]: >
Feb 20 16:34:41 crc kubenswrapper[4697]: I0220 16:34:41.567811 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-kb7jz" podUID="e9652ea9-a08a-4700-9a77-00b227062e91" containerName="registry-server" probeResult="failure" output=<
Feb 20 16:34:41 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s
Feb 20 16:34:41 crc kubenswrapper[4697]: >
Feb 20 16:34:41 crc kubenswrapper[4697]: I0220 16:34:41.799034 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-k68tp" podUID="efb428f3-f886-42e8-91cc-fba15966023c" containerName="registry-server" probeResult="failure" output=<
Feb 20 16:34:41 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s
Feb 20 16:34:41 crc kubenswrapper[4697]: >
Feb 20 16:34:42 crc kubenswrapper[4697]: I0220 16:34:42.118994 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5bghw"
Feb 20 16:34:42 crc kubenswrapper[4697]: I0220 16:34:42.119050 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5bghw"
Feb 20 16:34:42 crc kubenswrapper[4697]: I0220 16:34:42.173581 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5bghw"
Feb 20 16:34:42 crc kubenswrapper[4697]: I0220 16:34:42.556468 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rmfdq"
Feb 20 16:34:42 crc kubenswrapper[4697]: I0220 16:34:42.556758 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rmfdq"
Feb 20 16:34:42 crc kubenswrapper[4697]: I0220 16:34:42.599959 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rmfdq"
Feb 20 16:34:43 crc kubenswrapper[4697]: I0220 16:34:43.543489 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4zh8s"
Feb 20 16:34:43 crc kubenswrapper[4697]: I0220 16:34:43.543527 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4zh8s"
Feb 20 16:34:43 crc kubenswrapper[4697]: I0220 16:34:43.780667 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rmfdq"
Feb 20 16:34:44 crc kubenswrapper[4697]: I0220 16:34:44.048351 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jskqv"
Feb 20 16:34:44 crc kubenswrapper[4697]: I0220 16:34:44.464287 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmfdq"]
Feb 20 16:34:44 crc kubenswrapper[4697]: I0220 16:34:44.586669 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4zh8s" podUID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerName="registry-server" probeResult="failure" output=<
Feb 20 16:34:44 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s
Feb 20 16:34:44 crc kubenswrapper[4697]: >
Feb 20 16:34:45 crc kubenswrapper[4697]: I0220 16:34:45.727586 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rmfdq" podUID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" containerName="registry-server" containerID="cri-o://4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5" gracePeriod=2
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.035355 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmfdq"
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.118531 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9k7j\" (UniqueName: \"kubernetes.io/projected/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-kube-api-access-l9k7j\") pod \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") "
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.118599 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-utilities\") pod \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") "
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.118733 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-catalog-content\") pod \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\" (UID: \"7d026728-bdc4-4f2e-8162-fca5d1ba93ca\") "
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.120686 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-utilities" (OuterVolumeSpecName: "utilities") pod "7d026728-bdc4-4f2e-8162-fca5d1ba93ca" (UID: "7d026728-bdc4-4f2e-8162-fca5d1ba93ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.141925 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-kube-api-access-l9k7j" (OuterVolumeSpecName: "kube-api-access-l9k7j") pod "7d026728-bdc4-4f2e-8162-fca5d1ba93ca" (UID: "7d026728-bdc4-4f2e-8162-fca5d1ba93ca"). InnerVolumeSpecName "kube-api-access-l9k7j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.145079 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d026728-bdc4-4f2e-8162-fca5d1ba93ca" (UID: "7d026728-bdc4-4f2e-8162-fca5d1ba93ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.220413 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9k7j\" (UniqueName: \"kubernetes.io/projected/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-kube-api-access-l9k7j\") on node \"crc\" DevicePath \"\""
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.220499 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.220515 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d026728-bdc4-4f2e-8162-fca5d1ba93ca-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.749617 4697 generic.go:334] "Generic (PLEG): container finished" podID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" containerID="4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5" exitCode=0
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.749662 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmfdq" event={"ID":"7d026728-bdc4-4f2e-8162-fca5d1ba93ca","Type":"ContainerDied","Data":"4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5"}
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.749989 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmfdq" event={"ID":"7d026728-bdc4-4f2e-8162-fca5d1ba93ca","Type":"ContainerDied","Data":"7bf2640bd70d7b5bab1f93a2f1a2d2240d2e6e543b214f36d877c63a6beca2b5"}
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.749713 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmfdq"
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.750020 4697 scope.go:117] "RemoveContainer" containerID="4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5"
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.768463 4697 scope.go:117] "RemoveContainer" containerID="f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979"
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.775217 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmfdq"]
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.834850 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmfdq"]
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.844291 4697 scope.go:117] "RemoveContainer" containerID="e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8"
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.856550 4697 scope.go:117] "RemoveContainer" containerID="4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5"
Feb 20 16:34:46 crc kubenswrapper[4697]: E0220 16:34:46.856868 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5\": container with ID starting with 4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5 not found: ID does not exist" containerID="4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5"
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.856917 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5"} err="failed to get container status \"4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5\": rpc error: code = NotFound desc = could not find container \"4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5\": container with ID starting with 4b084c21610516a7e2bf32fecd56f5e74c4d8ab670446f335b1c1af43e8a36c5 not found: ID does not exist"
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.856977 4697 scope.go:117] "RemoveContainer" containerID="f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979"
Feb 20 16:34:46 crc kubenswrapper[4697]: E0220 16:34:46.857307 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979\": container with ID starting with f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979 not found: ID does not exist" containerID="f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979"
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.857331 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979"} err="failed to get container status \"f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979\": rpc error: code = NotFound desc = could not find container \"f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979\": container with ID starting with f87e1fa95cc7d71a1c7a26576c5f14b286a3cd34be3d29b94dcd9e64ab855979 not found: ID does not exist"
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.857346 4697 scope.go:117] "RemoveContainer" containerID="e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8"
Feb 20 16:34:46 crc kubenswrapper[4697]: E0220 16:34:46.857590 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8\": container with ID starting with e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8 not found: ID does not exist" containerID="e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8"
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.857612 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8"} err="failed to get container status \"e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8\": rpc error: code = NotFound desc = could not find container \"e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8\": container with ID starting with e2a482bbe74df780275cdf406dc55f7b2ffdb314f2242ef1f458f7b432b9c8f8 not found: ID does not exist"
Feb 20 16:34:46 crc kubenswrapper[4697]: I0220 16:34:46.883907 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" path="/var/lib/kubelet/pods/7d026728-bdc4-4f2e-8162-fca5d1ba93ca/volumes"
Feb 20 16:34:50 crc kubenswrapper[4697]: I0220 16:34:50.373105 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qjcxf"
Feb 20 16:34:50 crc kubenswrapper[4697]: I0220 16:34:50.413650 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qjcxf"
Feb 20 16:34:50 crc kubenswrapper[4697]: I0220 16:34:50.570286 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kb7jz"
Feb 20 16:34:50 crc kubenswrapper[4697]: I0220 16:34:50.606087 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kb7jz"
Feb 20 16:34:50 crc kubenswrapper[4697]: I0220 16:34:50.820953 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k68tp"
Feb 20 16:34:50 crc kubenswrapper[4697]: I0220 16:34:50.858059 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k68tp"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.458043 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k68tp"]
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.709981 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 20 16:34:51 crc kubenswrapper[4697]: E0220 16:34:51.710177 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" containerName="extract-content"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.710189 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" containerName="extract-content"
Feb 20 16:34:51 crc kubenswrapper[4697]: E0220 16:34:51.710200 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" containerName="registry-server"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.710206 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" containerName="registry-server"
Feb 20 16:34:51 crc kubenswrapper[4697]: E0220 16:34:51.710218 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95906860-29b3-49a2-b903-37b9a5a808a5" containerName="collect-profiles"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.710224 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="95906860-29b3-49a2-b903-37b9a5a808a5" containerName="collect-profiles"
Feb 20 16:34:51 crc kubenswrapper[4697]: E0220 16:34:51.710236 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd" containerName="pruner"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.710242 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd" containerName="pruner"
Feb 20 16:34:51 crc kubenswrapper[4697]: E0220 16:34:51.710249 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" containerName="extract-utilities"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.710256 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" containerName="extract-utilities"
Feb 20 16:34:51 crc kubenswrapper[4697]: E0220 16:34:51.710264 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36704629-b795-40ba-b710-ae563e2fb49b" containerName="pruner"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.710270 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="36704629-b795-40ba-b710-ae563e2fb49b" containerName="pruner"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.710351 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="95906860-29b3-49a2-b903-37b9a5a808a5" containerName="collect-profiles"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.710366 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="36704629-b795-40ba-b710-ae563e2fb49b" containerName="pruner"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.710372 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed07cfd3-4a3a-4bd5-a701-799aeca8b6fd" containerName="pruner"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.710383 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d026728-bdc4-4f2e-8162-fca5d1ba93ca" containerName="registry-server"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.710909 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.713904 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.717473 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.725450 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.797386 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"35630db3-c1f5-4781-826d-90a9ccb7d4b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.797795 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"35630db3-c1f5-4781-826d-90a9ccb7d4b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.899026 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"35630db3-c1f5-4781-826d-90a9ccb7d4b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.899108 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"35630db3-c1f5-4781-826d-90a9ccb7d4b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.899790 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"35630db3-c1f5-4781-826d-90a9ccb7d4b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 16:34:51 crc kubenswrapper[4697]: I0220 16:34:51.918931 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"35630db3-c1f5-4781-826d-90a9ccb7d4b9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.029208 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.167328 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5bghw"
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.225210 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 20 16:34:52 crc kubenswrapper[4697]: W0220 16:34:52.231124 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod35630db3_c1f5_4781_826d_90a9ccb7d4b9.slice/crio-11156a9ac89a27a8e5908c6f9379d5083be1c6af674c47b78487732cca26f786 WatchSource:0}: Error finding container 11156a9ac89a27a8e5908c6f9379d5083be1c6af674c47b78487732cca26f786: Status 404 returned error can't find the container with id 11156a9ac89a27a8e5908c6f9379d5083be1c6af674c47b78487732cca26f786
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.776642 4697 generic.go:334] "Generic (PLEG): container finished" podID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" containerID="ff57adb478e0c2995e1187103d8b359f8a495d1dcef8b25c6d2d007255624c44" exitCode=0
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.776978 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b742" event={"ID":"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3","Type":"ContainerDied","Data":"ff57adb478e0c2995e1187103d8b359f8a495d1dcef8b25c6d2d007255624c44"}
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.781112 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"35630db3-c1f5-4781-826d-90a9ccb7d4b9","Type":"ContainerStarted","Data":"25c08ef19e763257d392aa0ae569803716bf44368b187e3b509491f60861b051"}
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.781151 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"35630db3-c1f5-4781-826d-90a9ccb7d4b9","Type":"ContainerStarted","Data":"11156a9ac89a27a8e5908c6f9379d5083be1c6af674c47b78487732cca26f786"}
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.783097 4697 generic.go:334] "Generic (PLEG): container finished" podID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerID="bce216050989133983c77591a5f9f4477de53615b909ca89bcfef86605b00c4f" exitCode=0
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.783244 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k68tp" podUID="efb428f3-f886-42e8-91cc-fba15966023c" containerName="registry-server" containerID="cri-o://f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859" gracePeriod=2
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.783456 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7c4d" event={"ID":"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c","Type":"ContainerDied","Data":"bce216050989133983c77591a5f9f4477de53615b909ca89bcfef86605b00c4f"}
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.816228 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.816210147 podStartE2EDuration="1.816210147s" podCreationTimestamp="2026-02-20 16:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:52.810444752 +0000 UTC m=+200.590490160" watchObservedRunningTime="2026-02-20 16:34:52.816210147 +0000 UTC m=+200.596255555"
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.858647 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kb7jz"]
Feb 20 16:34:52 crc kubenswrapper[4697]: I0220 16:34:52.858858 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kb7jz" podUID="e9652ea9-a08a-4700-9a77-00b227062e91" containerName="registry-server" containerID="cri-o://a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f" gracePeriod=2
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.127862 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k68tp"
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.172129 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kb7jz"
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.215113 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-utilities\") pod \"efb428f3-f886-42e8-91cc-fba15966023c\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") "
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.215199 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-catalog-content\") pod \"efb428f3-f886-42e8-91cc-fba15966023c\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") "
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.215226 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hsln\" (UniqueName: \"kubernetes.io/projected/efb428f3-f886-42e8-91cc-fba15966023c-kube-api-access-9hsln\") pod \"efb428f3-f886-42e8-91cc-fba15966023c\" (UID: \"efb428f3-f886-42e8-91cc-fba15966023c\") "
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.216062 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-utilities" (OuterVolumeSpecName: "utilities") pod "efb428f3-f886-42e8-91cc-fba15966023c" (UID: "efb428f3-f886-42e8-91cc-fba15966023c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.222808 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb428f3-f886-42e8-91cc-fba15966023c-kube-api-access-9hsln" (OuterVolumeSpecName: "kube-api-access-9hsln") pod "efb428f3-f886-42e8-91cc-fba15966023c" (UID: "efb428f3-f886-42e8-91cc-fba15966023c"). InnerVolumeSpecName "kube-api-access-9hsln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.269986 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efb428f3-f886-42e8-91cc-fba15966023c" (UID: "efb428f3-f886-42e8-91cc-fba15966023c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.316456 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-utilities\") pod \"e9652ea9-a08a-4700-9a77-00b227062e91\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") "
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.316621 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2trq4\" (UniqueName: \"kubernetes.io/projected/e9652ea9-a08a-4700-9a77-00b227062e91-kube-api-access-2trq4\") pod \"e9652ea9-a08a-4700-9a77-00b227062e91\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") "
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.316781 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-catalog-content\") pod \"e9652ea9-a08a-4700-9a77-00b227062e91\" (UID: \"e9652ea9-a08a-4700-9a77-00b227062e91\") "
Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.317139 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-utilities" (OuterVolumeSpecName: "utilities") pod "e9652ea9-a08a-4700-9a77-00b227062e91" (UID: "e9652ea9-a08a-4700-9a77-00b227062e91"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.317177 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.317201 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hsln\" (UniqueName: \"kubernetes.io/projected/efb428f3-f886-42e8-91cc-fba15966023c-kube-api-access-9hsln\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.317212 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb428f3-f886-42e8-91cc-fba15966023c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.319757 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9652ea9-a08a-4700-9a77-00b227062e91-kube-api-access-2trq4" (OuterVolumeSpecName: "kube-api-access-2trq4") pod "e9652ea9-a08a-4700-9a77-00b227062e91" (UID: "e9652ea9-a08a-4700-9a77-00b227062e91"). InnerVolumeSpecName "kube-api-access-2trq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.375951 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9652ea9-a08a-4700-9a77-00b227062e91" (UID: "e9652ea9-a08a-4700-9a77-00b227062e91"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.419405 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.419495 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2trq4\" (UniqueName: \"kubernetes.io/projected/e9652ea9-a08a-4700-9a77-00b227062e91-kube-api-access-2trq4\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.419507 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9652ea9-a08a-4700-9a77-00b227062e91-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.583739 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.625065 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.792541 4697 generic.go:334] "Generic (PLEG): container finished" podID="e9652ea9-a08a-4700-9a77-00b227062e91" containerID="a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f" exitCode=0 Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.792645 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kb7jz" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.792641 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kb7jz" event={"ID":"e9652ea9-a08a-4700-9a77-00b227062e91","Type":"ContainerDied","Data":"a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f"} Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.792718 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kb7jz" event={"ID":"e9652ea9-a08a-4700-9a77-00b227062e91","Type":"ContainerDied","Data":"7ab933464df224c5e64c57860f8b0234ffcbed328803e0b1f4a1f40c3f28eae2"} Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.792754 4697 scope.go:117] "RemoveContainer" containerID="a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.798834 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b742" event={"ID":"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3","Type":"ContainerStarted","Data":"3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f"} Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.802169 4697 generic.go:334] "Generic (PLEG): container finished" podID="35630db3-c1f5-4781-826d-90a9ccb7d4b9" containerID="25c08ef19e763257d392aa0ae569803716bf44368b187e3b509491f60861b051" exitCode=0 Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.802249 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"35630db3-c1f5-4781-826d-90a9ccb7d4b9","Type":"ContainerDied","Data":"25c08ef19e763257d392aa0ae569803716bf44368b187e3b509491f60861b051"} Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.804488 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7c4d" 
event={"ID":"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c","Type":"ContainerStarted","Data":"d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2"} Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.807241 4697 scope.go:117] "RemoveContainer" containerID="18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.810157 4697 generic.go:334] "Generic (PLEG): container finished" podID="efb428f3-f886-42e8-91cc-fba15966023c" containerID="f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859" exitCode=0 Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.810248 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k68tp" event={"ID":"efb428f3-f886-42e8-91cc-fba15966023c","Type":"ContainerDied","Data":"f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859"} Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.810274 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k68tp" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.810303 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k68tp" event={"ID":"efb428f3-f886-42e8-91cc-fba15966023c","Type":"ContainerDied","Data":"969a8728439fc3786328e92ae3a2cad29914f5d00968e004c38de3ddc2f020b9"} Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.822759 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7b742" podStartSLOduration=2.792587195 podStartE2EDuration="44.822740533s" podCreationTimestamp="2026-02-20 16:34:09 +0000 UTC" firstStartedPulling="2026-02-20 16:34:11.233132087 +0000 UTC m=+159.013177495" lastFinishedPulling="2026-02-20 16:34:53.263285425 +0000 UTC m=+201.043330833" observedRunningTime="2026-02-20 16:34:53.819862791 +0000 UTC m=+201.599908199" watchObservedRunningTime="2026-02-20 16:34:53.822740533 +0000 UTC m=+201.602785941" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.822868 4697 scope.go:117] "RemoveContainer" containerID="298f7d7f37a7452693a57cac7a66e42ee3bebc86d248ac8d173d617950555835" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.843852 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j7c4d" podStartSLOduration=3.100377984 podStartE2EDuration="40.843832747s" podCreationTimestamp="2026-02-20 16:34:13 +0000 UTC" firstStartedPulling="2026-02-20 16:34:15.428696532 +0000 UTC m=+163.208741940" lastFinishedPulling="2026-02-20 16:34:53.172151295 +0000 UTC m=+200.952196703" observedRunningTime="2026-02-20 16:34:53.836760955 +0000 UTC m=+201.616806363" watchObservedRunningTime="2026-02-20 16:34:53.843832747 +0000 UTC m=+201.623878145" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.851578 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-kb7jz"] Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.859635 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kb7jz"] Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.866334 4697 scope.go:117] "RemoveContainer" containerID="a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f" Feb 20 16:34:53 crc kubenswrapper[4697]: E0220 16:34:53.866790 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f\": container with ID starting with a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f not found: ID does not exist" containerID="a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.866819 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f"} err="failed to get container status \"a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f\": rpc error: code = NotFound desc = could not find container \"a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f\": container with ID starting with a24cc235aaf0c619b5a826049211302a1cc42044d213d91613ecc4b59672744f not found: ID does not exist" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.866838 4697 scope.go:117] "RemoveContainer" containerID="18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3" Feb 20 16:34:53 crc kubenswrapper[4697]: E0220 16:34:53.867158 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3\": container with ID starting with 
18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3 not found: ID does not exist" containerID="18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.867174 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3"} err="failed to get container status \"18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3\": rpc error: code = NotFound desc = could not find container \"18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3\": container with ID starting with 18125e2fb418f9dfd86b4a5fabddddb55c7d6debb8ced142a59fa4b1e5cb1fa3 not found: ID does not exist" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.867186 4697 scope.go:117] "RemoveContainer" containerID="298f7d7f37a7452693a57cac7a66e42ee3bebc86d248ac8d173d617950555835" Feb 20 16:34:53 crc kubenswrapper[4697]: E0220 16:34:53.867386 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"298f7d7f37a7452693a57cac7a66e42ee3bebc86d248ac8d173d617950555835\": container with ID starting with 298f7d7f37a7452693a57cac7a66e42ee3bebc86d248ac8d173d617950555835 not found: ID does not exist" containerID="298f7d7f37a7452693a57cac7a66e42ee3bebc86d248ac8d173d617950555835" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.867402 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298f7d7f37a7452693a57cac7a66e42ee3bebc86d248ac8d173d617950555835"} err="failed to get container status \"298f7d7f37a7452693a57cac7a66e42ee3bebc86d248ac8d173d617950555835\": rpc error: code = NotFound desc = could not find container \"298f7d7f37a7452693a57cac7a66e42ee3bebc86d248ac8d173d617950555835\": container with ID starting with 298f7d7f37a7452693a57cac7a66e42ee3bebc86d248ac8d173d617950555835 not found: ID does not 
exist" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.867418 4697 scope.go:117] "RemoveContainer" containerID="f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.885613 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k68tp"] Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.893171 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k68tp"] Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.901937 4697 scope.go:117] "RemoveContainer" containerID="ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.938423 4697 scope.go:117] "RemoveContainer" containerID="1d0a2b06285771bd6209a3b3a457added3bd9e3dfa1f623b9cf4c24b49ea0d5d" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.955958 4697 scope.go:117] "RemoveContainer" containerID="f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859" Feb 20 16:34:53 crc kubenswrapper[4697]: E0220 16:34:53.957423 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859\": container with ID starting with f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859 not found: ID does not exist" containerID="f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.957477 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859"} err="failed to get container status \"f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859\": rpc error: code = NotFound desc = could not find container \"f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859\": 
container with ID starting with f65e09063b6ed8413cca18a8fdd9c3c60ea94fa234ae420f4ee8fa0840c1b859 not found: ID does not exist" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.957501 4697 scope.go:117] "RemoveContainer" containerID="ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968" Feb 20 16:34:53 crc kubenswrapper[4697]: E0220 16:34:53.957951 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968\": container with ID starting with ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968 not found: ID does not exist" containerID="ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.958001 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968"} err="failed to get container status \"ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968\": rpc error: code = NotFound desc = could not find container \"ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968\": container with ID starting with ab4b6c9c9ebbaaae8c9a3fbd81bc3e343d742b162a7ae9b5bffd21ef6a2fb968 not found: ID does not exist" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.958036 4697 scope.go:117] "RemoveContainer" containerID="1d0a2b06285771bd6209a3b3a457added3bd9e3dfa1f623b9cf4c24b49ea0d5d" Feb 20 16:34:53 crc kubenswrapper[4697]: E0220 16:34:53.958318 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d0a2b06285771bd6209a3b3a457added3bd9e3dfa1f623b9cf4c24b49ea0d5d\": container with ID starting with 1d0a2b06285771bd6209a3b3a457added3bd9e3dfa1f623b9cf4c24b49ea0d5d not found: ID does not exist" 
containerID="1d0a2b06285771bd6209a3b3a457added3bd9e3dfa1f623b9cf4c24b49ea0d5d" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.958363 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d0a2b06285771bd6209a3b3a457added3bd9e3dfa1f623b9cf4c24b49ea0d5d"} err="failed to get container status \"1d0a2b06285771bd6209a3b3a457added3bd9e3dfa1f623b9cf4c24b49ea0d5d\": rpc error: code = NotFound desc = could not find container \"1d0a2b06285771bd6209a3b3a457added3bd9e3dfa1f623b9cf4c24b49ea0d5d\": container with ID starting with 1d0a2b06285771bd6209a3b3a457added3bd9e3dfa1f623b9cf4c24b49ea0d5d not found: ID does not exist" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.975595 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:53 crc kubenswrapper[4697]: I0220 16:34:53.975649 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:34:54 crc kubenswrapper[4697]: I0220 16:34:54.882744 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9652ea9-a08a-4700-9a77-00b227062e91" path="/var/lib/kubelet/pods/e9652ea9-a08a-4700-9a77-00b227062e91/volumes" Feb 20 16:34:54 crc kubenswrapper[4697]: I0220 16:34:54.883351 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb428f3-f886-42e8-91cc-fba15966023c" path="/var/lib/kubelet/pods/efb428f3-f886-42e8-91cc-fba15966023c/volumes" Feb 20 16:34:55 crc kubenswrapper[4697]: I0220 16:34:55.035703 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j7c4d" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerName="registry-server" probeResult="failure" output=< Feb 20 16:34:55 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 16:34:55 crc kubenswrapper[4697]: > Feb 20 16:34:55 crc 
kubenswrapper[4697]: I0220 16:34:55.110093 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 16:34:55 crc kubenswrapper[4697]: I0220 16:34:55.238818 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kubelet-dir\") pod \"35630db3-c1f5-4781-826d-90a9ccb7d4b9\" (UID: \"35630db3-c1f5-4781-826d-90a9ccb7d4b9\") " Feb 20 16:34:55 crc kubenswrapper[4697]: I0220 16:34:55.238906 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kube-api-access\") pod \"35630db3-c1f5-4781-826d-90a9ccb7d4b9\" (UID: \"35630db3-c1f5-4781-826d-90a9ccb7d4b9\") " Feb 20 16:34:55 crc kubenswrapper[4697]: I0220 16:34:55.239048 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "35630db3-c1f5-4781-826d-90a9ccb7d4b9" (UID: "35630db3-c1f5-4781-826d-90a9ccb7d4b9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:34:55 crc kubenswrapper[4697]: I0220 16:34:55.239294 4697 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:55 crc kubenswrapper[4697]: I0220 16:34:55.243403 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "35630db3-c1f5-4781-826d-90a9ccb7d4b9" (UID: "35630db3-c1f5-4781-826d-90a9ccb7d4b9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:34:55 crc kubenswrapper[4697]: I0220 16:34:55.340930 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35630db3-c1f5-4781-826d-90a9ccb7d4b9-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 16:34:55 crc kubenswrapper[4697]: I0220 16:34:55.824813 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"35630db3-c1f5-4781-826d-90a9ccb7d4b9","Type":"ContainerDied","Data":"11156a9ac89a27a8e5908c6f9379d5083be1c6af674c47b78487732cca26f786"} Feb 20 16:34:55 crc kubenswrapper[4697]: I0220 16:34:55.824862 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11156a9ac89a27a8e5908c6f9379d5083be1c6af674c47b78487732cca26f786" Feb 20 16:34:55 crc kubenswrapper[4697]: I0220 16:34:55.824890 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.528910 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 20 16:34:57 crc kubenswrapper[4697]: E0220 16:34:57.529452 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9652ea9-a08a-4700-9a77-00b227062e91" containerName="extract-content" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.529472 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9652ea9-a08a-4700-9a77-00b227062e91" containerName="extract-content" Feb 20 16:34:57 crc kubenswrapper[4697]: E0220 16:34:57.529483 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb428f3-f886-42e8-91cc-fba15966023c" containerName="extract-content" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.529490 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb428f3-f886-42e8-91cc-fba15966023c" 
containerName="extract-content" Feb 20 16:34:57 crc kubenswrapper[4697]: E0220 16:34:57.529500 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb428f3-f886-42e8-91cc-fba15966023c" containerName="registry-server" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.529509 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb428f3-f886-42e8-91cc-fba15966023c" containerName="registry-server" Feb 20 16:34:57 crc kubenswrapper[4697]: E0220 16:34:57.529520 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9652ea9-a08a-4700-9a77-00b227062e91" containerName="registry-server" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.529528 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9652ea9-a08a-4700-9a77-00b227062e91" containerName="registry-server" Feb 20 16:34:57 crc kubenswrapper[4697]: E0220 16:34:57.529548 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35630db3-c1f5-4781-826d-90a9ccb7d4b9" containerName="pruner" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.529556 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="35630db3-c1f5-4781-826d-90a9ccb7d4b9" containerName="pruner" Feb 20 16:34:57 crc kubenswrapper[4697]: E0220 16:34:57.529569 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb428f3-f886-42e8-91cc-fba15966023c" containerName="extract-utilities" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.529577 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb428f3-f886-42e8-91cc-fba15966023c" containerName="extract-utilities" Feb 20 16:34:57 crc kubenswrapper[4697]: E0220 16:34:57.529592 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9652ea9-a08a-4700-9a77-00b227062e91" containerName="extract-utilities" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.529600 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9652ea9-a08a-4700-9a77-00b227062e91" 
containerName="extract-utilities" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.529750 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb428f3-f886-42e8-91cc-fba15966023c" containerName="registry-server" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.529766 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="35630db3-c1f5-4781-826d-90a9ccb7d4b9" containerName="pruner" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.529777 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9652ea9-a08a-4700-9a77-00b227062e91" containerName="registry-server" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.530450 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.533741 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.538847 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.538980 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.674335 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.674405 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kube-api-access\") pod \"installer-9-crc\" (UID: \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.674522 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-var-lock\") pod \"installer-9-crc\" (UID: \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.775260 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.775405 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.775714 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kube-api-access\") pod \"installer-9-crc\" (UID: \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.775867 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-var-lock\") pod \"installer-9-crc\" (UID: 
\"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.776035 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-var-lock\") pod \"installer-9-crc\" (UID: \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.802817 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kube-api-access\") pod \"installer-9-crc\" (UID: \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:34:57 crc kubenswrapper[4697]: I0220 16:34:57.851547 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:34:58 crc kubenswrapper[4697]: I0220 16:34:58.074174 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 20 16:34:58 crc kubenswrapper[4697]: I0220 16:34:58.839311 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b1a228a0-19c1-441c-a35e-7e2bc741dc2d","Type":"ContainerStarted","Data":"18b943a6d32eae2e7ab4ce9d69e027df5674b27a2982f1a74e735898d1cd8ad0"} Feb 20 16:34:58 crc kubenswrapper[4697]: I0220 16:34:58.839769 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b1a228a0-19c1-441c-a35e-7e2bc741dc2d","Type":"ContainerStarted","Data":"13e0a594f2a79b091e87a84bd1b6974e14f0b4bf1e662e365263a265206be9df"} Feb 20 16:34:58 crc kubenswrapper[4697]: I0220 16:34:58.852397 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" 
podStartSLOduration=1.8523829840000001 podStartE2EDuration="1.852382984s" podCreationTimestamp="2026-02-20 16:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:34:58.852197559 +0000 UTC m=+206.632242967" watchObservedRunningTime="2026-02-20 16:34:58.852382984 +0000 UTC m=+206.632428392" Feb 20 16:35:00 crc kubenswrapper[4697]: I0220 16:35:00.130636 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7b742" Feb 20 16:35:00 crc kubenswrapper[4697]: I0220 16:35:00.131013 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7b742" Feb 20 16:35:00 crc kubenswrapper[4697]: I0220 16:35:00.170568 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7b742" Feb 20 16:35:00 crc kubenswrapper[4697]: I0220 16:35:00.886288 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7b742" Feb 20 16:35:01 crc kubenswrapper[4697]: I0220 16:35:01.185181 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:35:01 crc kubenswrapper[4697]: I0220 16:35:01.185508 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:35:01 crc kubenswrapper[4697]: I0220 16:35:01.185548 4697 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:35:01 crc kubenswrapper[4697]: I0220 16:35:01.186084 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 16:35:01 crc kubenswrapper[4697]: I0220 16:35:01.186131 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6" gracePeriod=600 Feb 20 16:35:01 crc kubenswrapper[4697]: I0220 16:35:01.856558 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6" exitCode=0 Feb 20 16:35:01 crc kubenswrapper[4697]: I0220 16:35:01.856639 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6"} Feb 20 16:35:01 crc kubenswrapper[4697]: I0220 16:35:01.856887 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"f85fb4955a3d3f0a5802d8ba1386616b2a43a2300dcd5a99ec3f0c4b0ac3114b"} Feb 20 16:35:04 crc kubenswrapper[4697]: I0220 16:35:04.025676 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:35:04 crc kubenswrapper[4697]: I0220 16:35:04.096877 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:35:05 crc kubenswrapper[4697]: I0220 16:35:05.859356 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j7c4d"] Feb 20 16:35:05 crc kubenswrapper[4697]: I0220 16:35:05.881555 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j7c4d" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerName="registry-server" containerID="cri-o://d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2" gracePeriod=2 Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.219505 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.292790 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqxrp\" (UniqueName: \"kubernetes.io/projected/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-kube-api-access-jqxrp\") pod \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.292904 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-utilities\") pod \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.292950 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-catalog-content\") pod 
\"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\" (UID: \"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c\") " Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.295038 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-utilities" (OuterVolumeSpecName: "utilities") pod "b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" (UID: "b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.299318 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-kube-api-access-jqxrp" (OuterVolumeSpecName: "kube-api-access-jqxrp") pod "b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" (UID: "b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c"). InnerVolumeSpecName "kube-api-access-jqxrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.395048 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqxrp\" (UniqueName: \"kubernetes.io/projected/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-kube-api-access-jqxrp\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.395081 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.411809 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" (UID: "b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.495820 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.898239 4697 generic.go:334] "Generic (PLEG): container finished" podID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerID="d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2" exitCode=0 Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.898380 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7c4d" event={"ID":"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c","Type":"ContainerDied","Data":"d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2"} Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.898417 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7c4d" event={"ID":"b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c","Type":"ContainerDied","Data":"365ed9085b0b33aab41edd4c8999d68555a76a510f32d37407a7d2e7450b14a5"} Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.898443 4697 scope.go:117] "RemoveContainer" containerID="d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.898614 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j7c4d" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.915282 4697 scope.go:117] "RemoveContainer" containerID="bce216050989133983c77591a5f9f4477de53615b909ca89bcfef86605b00c4f" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.922970 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j7c4d"] Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.925481 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j7c4d"] Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.949708 4697 scope.go:117] "RemoveContainer" containerID="cde1fd3a62bee76a5a86951f56f281122c088f3e9d6f9ebefb3ceffea329ff84" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.964213 4697 scope.go:117] "RemoveContainer" containerID="d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2" Feb 20 16:35:06 crc kubenswrapper[4697]: E0220 16:35:06.965190 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2\": container with ID starting with d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2 not found: ID does not exist" containerID="d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.965227 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2"} err="failed to get container status \"d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2\": rpc error: code = NotFound desc = could not find container \"d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2\": container with ID starting with d730320b101387a8cdb1d632c39384639b4ba6c3542f0fa683ca936094dd93b2 not found: ID does 
not exist" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.965249 4697 scope.go:117] "RemoveContainer" containerID="bce216050989133983c77591a5f9f4477de53615b909ca89bcfef86605b00c4f" Feb 20 16:35:06 crc kubenswrapper[4697]: E0220 16:35:06.965555 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bce216050989133983c77591a5f9f4477de53615b909ca89bcfef86605b00c4f\": container with ID starting with bce216050989133983c77591a5f9f4477de53615b909ca89bcfef86605b00c4f not found: ID does not exist" containerID="bce216050989133983c77591a5f9f4477de53615b909ca89bcfef86605b00c4f" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.965575 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce216050989133983c77591a5f9f4477de53615b909ca89bcfef86605b00c4f"} err="failed to get container status \"bce216050989133983c77591a5f9f4477de53615b909ca89bcfef86605b00c4f\": rpc error: code = NotFound desc = could not find container \"bce216050989133983c77591a5f9f4477de53615b909ca89bcfef86605b00c4f\": container with ID starting with bce216050989133983c77591a5f9f4477de53615b909ca89bcfef86605b00c4f not found: ID does not exist" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.965589 4697 scope.go:117] "RemoveContainer" containerID="cde1fd3a62bee76a5a86951f56f281122c088f3e9d6f9ebefb3ceffea329ff84" Feb 20 16:35:06 crc kubenswrapper[4697]: E0220 16:35:06.965911 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde1fd3a62bee76a5a86951f56f281122c088f3e9d6f9ebefb3ceffea329ff84\": container with ID starting with cde1fd3a62bee76a5a86951f56f281122c088f3e9d6f9ebefb3ceffea329ff84 not found: ID does not exist" containerID="cde1fd3a62bee76a5a86951f56f281122c088f3e9d6f9ebefb3ceffea329ff84" Feb 20 16:35:06 crc kubenswrapper[4697]: I0220 16:35:06.965936 4697 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde1fd3a62bee76a5a86951f56f281122c088f3e9d6f9ebefb3ceffea329ff84"} err="failed to get container status \"cde1fd3a62bee76a5a86951f56f281122c088f3e9d6f9ebefb3ceffea329ff84\": rpc error: code = NotFound desc = could not find container \"cde1fd3a62bee76a5a86951f56f281122c088f3e9d6f9ebefb3ceffea329ff84\": container with ID starting with cde1fd3a62bee76a5a86951f56f281122c088f3e9d6f9ebefb3ceffea329ff84 not found: ID does not exist" Feb 20 16:35:08 crc kubenswrapper[4697]: I0220 16:35:08.888635 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" path="/var/lib/kubelet/pods/b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c/volumes" Feb 20 16:35:12 crc kubenswrapper[4697]: I0220 16:35:12.816965 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m8rl6"] Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.080664 4697 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.081516 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerName="extract-utilities" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.081534 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerName="extract-utilities" Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.081546 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerName="registry-server" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.081553 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerName="registry-server" Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.081578 4697 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerName="extract-content" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.081586 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerName="extract-content" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.081709 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b38bb570-3ff7-4b0c-be67-9a40ff0b3b1c" containerName="registry-server" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.082098 4697 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.082289 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.082438 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577" gracePeriod=15 Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.082509 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91" gracePeriod=15 Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.082508 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93" gracePeriod=15 Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.082484 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e" gracePeriod=15 Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.083236 4697 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.083607 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.083679 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.083710 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.083720 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.083735 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.083746 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.083757 4697 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.083768 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.083784 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.083793 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.083806 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.083815 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.083952 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.083968 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.083982 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.083994 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.084004 
4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.084018 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.084148 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.084161 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.082605 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7" gracePeriod=15 Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.163529 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.163611 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:36 crc 
kubenswrapper[4697]: I0220 16:35:36.163695 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.163733 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.163835 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.163865 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.163899 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.163935 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.178504 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.264799 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.264877 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.264911 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.264982 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.264984 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.265005 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.265028 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.265047 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.265065 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.265088 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.265112 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.265136 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.265155 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.265207 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.265208 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.265215 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: I0220 16:35:36.469178 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:35:36 crc kubenswrapper[4697]: W0220 16:35:36.493496 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-cd7e92e804e4602300aac26cdcdc2b5c202172e291e3153961d4410722fc1ad6 WatchSource:0}: Error finding container cd7e92e804e4602300aac26cdcdc2b5c202172e291e3153961d4410722fc1ad6: Status 404 returned error can't find the container with id cd7e92e804e4602300aac26cdcdc2b5c202172e291e3153961d4410722fc1ad6 Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.496803 4697 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189601aacc07fbaf openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 16:35:36.496012207 +0000 UTC m=+244.276057625,LastTimestamp:2026-02-20 16:35:36.496012207 +0000 UTC m=+244.276057625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 16:35:36 crc kubenswrapper[4697]: E0220 16:35:36.545305 4697 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189601aacc07fbaf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 16:35:36.496012207 +0000 UTC m=+244.276057625,LastTimestamp:2026-02-20 16:35:36.496012207 +0000 UTC m=+244.276057625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.073196 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5"} Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.073979 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"cd7e92e804e4602300aac26cdcdc2b5c202172e291e3153961d4410722fc1ad6"} Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.074111 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.075654 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.076833 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.077416 4697 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91" exitCode=0 Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.077517 4697 generic.go:334] "Generic 
(PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e" exitCode=0 Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.077591 4697 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93" exitCode=0 Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.077652 4697 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7" exitCode=2 Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.077748 4697 scope.go:117] "RemoveContainer" containerID="421f58e602e3d84d63b35ed40b0afe0326e8517b291336f1b5c32bf1ae5bccae" Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.079440 4697 generic.go:334] "Generic (PLEG): container finished" podID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" containerID="18b943a6d32eae2e7ab4ce9d69e027df5674b27a2982f1a74e735898d1cd8ad0" exitCode=0 Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.079562 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b1a228a0-19c1-441c-a35e-7e2bc741dc2d","Type":"ContainerDied","Data":"18b943a6d32eae2e7ab4ce9d69e027df5674b27a2982f1a74e735898d1cd8ad0"} Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.080114 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.080506 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:37 crc kubenswrapper[4697]: I0220 16:35:37.837530 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" containerName="oauth-openshift" containerID="cri-o://028a8be4b22ef5e98577ff42d1b22e53ccbd46a0fdf70a507102159c01a427ec" gracePeriod=15 Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.087003 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.089744 4697 generic.go:334] "Generic (PLEG): container finished" podID="d919170d-e25f-4a96-9503-edaf4c0c3c51" containerID="028a8be4b22ef5e98577ff42d1b22e53ccbd46a0fdf70a507102159c01a427ec" exitCode=0 Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.089870 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" event={"ID":"d919170d-e25f-4a96-9503-edaf4c0c3c51","Type":"ContainerDied","Data":"028a8be4b22ef5e98577ff42d1b22e53ccbd46a0fdf70a507102159c01a427ec"} Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.229579 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.230382 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.230899 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.231223 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.295757 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-session\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.295820 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-ocp-branding-template\") pod 
\"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.295851 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-cliconfig\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.295878 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsmkf\" (UniqueName: \"kubernetes.io/projected/d919170d-e25f-4a96-9503-edaf4c0c3c51-kube-api-access-fsmkf\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.295897 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-trusted-ca-bundle\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.295935 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-policies\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.295982 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-error\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc 
kubenswrapper[4697]: I0220 16:35:38.296004 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-router-certs\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.296031 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-dir\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.296051 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-idp-0-file-data\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.296071 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-service-ca\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.296087 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-serving-cert\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.296107 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-provider-selection\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.296131 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-login\") pod \"d919170d-e25f-4a96-9503-edaf4c0c3c51\" (UID: \"d919170d-e25f-4a96-9503-edaf4c0c3c51\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.297098 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.297863 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.298558 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.299588 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.302687 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.310111 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d919170d-e25f-4a96-9503-edaf4c0c3c51-kube-api-access-fsmkf" (OuterVolumeSpecName: "kube-api-access-fsmkf") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "kube-api-access-fsmkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.311483 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.312483 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.313261 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.313658 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.313664 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.313843 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.313938 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.323077 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d919170d-e25f-4a96-9503-edaf4c0c3c51" (UID: "d919170d-e25f-4a96-9503-edaf4c0c3c51"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.361942 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.362603 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.362961 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.363253 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.397769 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kube-api-access\") pod \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\" (UID: \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398127 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kubelet-dir\") pod \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\" (UID: \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " Feb 20 
16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398180 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-var-lock\") pod \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\" (UID: \"b1a228a0-19c1-441c-a35e-7e2bc741dc2d\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398273 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b1a228a0-19c1-441c-a35e-7e2bc741dc2d" (UID: "b1a228a0-19c1-441c-a35e-7e2bc741dc2d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398414 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-var-lock" (OuterVolumeSpecName: "var-lock") pod "b1a228a0-19c1-441c-a35e-7e2bc741dc2d" (UID: "b1a228a0-19c1-441c-a35e-7e2bc741dc2d"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398487 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398507 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398521 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398537 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398553 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398568 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398582 4697 reconciler_common.go:293] "Volume detached for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398595 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398609 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398622 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsmkf\" (UniqueName: \"kubernetes.io/projected/d919170d-e25f-4a96-9503-edaf4c0c3c51-kube-api-access-fsmkf\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398639 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398651 4697 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398664 4697 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398681 4697 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d919170d-e25f-4a96-9503-edaf4c0c3c51-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.398694 4697 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d919170d-e25f-4a96-9503-edaf4c0c3c51-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.403862 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b1a228a0-19c1-441c-a35e-7e2bc741dc2d" (UID: "b1a228a0-19c1-441c-a35e-7e2bc741dc2d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.460758 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.461766 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.462493 4697 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.462803 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.462970 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.463100 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.499599 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.499645 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.499699 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.499737 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.499775 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.499774 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.500051 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.500063 4697 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.500071 4697 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.500080 4697 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b1a228a0-19c1-441c-a35e-7e2bc741dc2d-var-lock\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.500089 4697 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 16:35:38 crc kubenswrapper[4697]: I0220 16:35:38.886425 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 20 16:35:38 crc kubenswrapper[4697]: E0220 16:35:38.898967 4697 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 
38.102.83.44:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" volumeName="registry-storage" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.099168 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.100083 4697 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577" exitCode=0 Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.100161 4697 scope.go:117] "RemoveContainer" containerID="8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.100303 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.102619 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.103070 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.103711 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.103732 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b1a228a0-19c1-441c-a35e-7e2bc741dc2d","Type":"ContainerDied","Data":"13e0a594f2a79b091e87a84bd1b6974e14f0b4bf1e662e365263a265206be9df"} Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.103760 4697 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.103785 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13e0a594f2a79b091e87a84bd1b6974e14f0b4bf1e662e365263a265206be9df" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.104333 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.106056 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.107343 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.107381 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" event={"ID":"d919170d-e25f-4a96-9503-edaf4c0c3c51","Type":"ContainerDied","Data":"05fa5c0c04112d2b9169a578d002eecc1e60c8b81911fb2c46587ad365781dda"} Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.107820 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.108265 4697 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.108815 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.111059 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: 
connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.112528 4697 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.114001 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.114770 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.115084 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.115247 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.115406 4697 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.115868 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.131869 4697 scope.go:117] "RemoveContainer" containerID="98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.156246 4697 scope.go:117] "RemoveContainer" containerID="228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.185875 4697 scope.go:117] "RemoveContainer" containerID="6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.214080 4697 scope.go:117] "RemoveContainer" containerID="6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.238585 4697 scope.go:117] "RemoveContainer" containerID="462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.258297 4697 scope.go:117] "RemoveContainer" containerID="8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91" Feb 20 16:35:39 crc kubenswrapper[4697]: E0220 16:35:39.258780 4697 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\": container with ID starting with 8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91 not found: ID does not exist" containerID="8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.258806 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91"} err="failed to get container status \"8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\": rpc error: code = NotFound desc = could not find container \"8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91\": container with ID starting with 8b2e02ca94693d05001335fad6025971c47b34e80df0b4231c37847543e2bc91 not found: ID does not exist" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.258826 4697 scope.go:117] "RemoveContainer" containerID="98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e" Feb 20 16:35:39 crc kubenswrapper[4697]: E0220 16:35:39.259429 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\": container with ID starting with 98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e not found: ID does not exist" containerID="98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.259551 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e"} err="failed to get container status \"98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\": rpc error: code = NotFound 
desc = could not find container \"98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e\": container with ID starting with 98bb7c63c4135387e962189bc93e3f5c263370c9d0b4939ba27593dbe207190e not found: ID does not exist" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.259569 4697 scope.go:117] "RemoveContainer" containerID="228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93" Feb 20 16:35:39 crc kubenswrapper[4697]: E0220 16:35:39.260308 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\": container with ID starting with 228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93 not found: ID does not exist" containerID="228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.260334 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93"} err="failed to get container status \"228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\": rpc error: code = NotFound desc = could not find container \"228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93\": container with ID starting with 228fcab2e92caeb23672869f857613403ad3742f91c8f907d3af8d8437a69a93 not found: ID does not exist" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.260348 4697 scope.go:117] "RemoveContainer" containerID="6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7" Feb 20 16:35:39 crc kubenswrapper[4697]: E0220 16:35:39.260891 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\": container with ID starting with 
6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7 not found: ID does not exist" containerID="6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.260954 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7"} err="failed to get container status \"6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\": rpc error: code = NotFound desc = could not find container \"6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7\": container with ID starting with 6bda693ade379ad1075c02b76d0e6ca6ba3d88bbf6d90c730e508c75392b56d7 not found: ID does not exist" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.260994 4697 scope.go:117] "RemoveContainer" containerID="6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577" Feb 20 16:35:39 crc kubenswrapper[4697]: E0220 16:35:39.261309 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\": container with ID starting with 6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577 not found: ID does not exist" containerID="6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.261348 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577"} err="failed to get container status \"6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\": rpc error: code = NotFound desc = could not find container \"6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577\": container with ID starting with 6d4607baf48018c3ab03c1981d5a10fb342472ad16547017e499e132948c0577 not found: ID does not 
exist" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.261366 4697 scope.go:117] "RemoveContainer" containerID="462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5" Feb 20 16:35:39 crc kubenswrapper[4697]: E0220 16:35:39.261547 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\": container with ID starting with 462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5 not found: ID does not exist" containerID="462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.261563 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5"} err="failed to get container status \"462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\": rpc error: code = NotFound desc = could not find container \"462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5\": container with ID starting with 462040168faeb75475e2d61273f18e885c71d09b430deb668720afd8d8a67ba5 not found: ID does not exist" Feb 20 16:35:39 crc kubenswrapper[4697]: I0220 16:35:39.261575 4697 scope.go:117] "RemoveContainer" containerID="028a8be4b22ef5e98577ff42d1b22e53ccbd46a0fdf70a507102159c01a427ec" Feb 20 16:35:41 crc kubenswrapper[4697]: E0220 16:35:41.898054 4697 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:41 crc kubenswrapper[4697]: E0220 16:35:41.899519 4697 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: 
connection refused" Feb 20 16:35:41 crc kubenswrapper[4697]: E0220 16:35:41.899894 4697 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:41 crc kubenswrapper[4697]: E0220 16:35:41.900405 4697 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:41 crc kubenswrapper[4697]: E0220 16:35:41.900935 4697 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:41 crc kubenswrapper[4697]: I0220 16:35:41.900985 4697 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 20 16:35:41 crc kubenswrapper[4697]: E0220 16:35:41.901373 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="200ms" Feb 20 16:35:42 crc kubenswrapper[4697]: E0220 16:35:42.102650 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="400ms" Feb 20 16:35:42 crc kubenswrapper[4697]: E0220 16:35:42.503944 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="800ms" Feb 20 16:35:42 crc kubenswrapper[4697]: I0220 16:35:42.882694 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:42 crc kubenswrapper[4697]: I0220 16:35:42.883866 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:42 crc kubenswrapper[4697]: I0220 16:35:42.884436 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:43 crc kubenswrapper[4697]: E0220 16:35:43.305533 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="1.6s" Feb 20 16:35:44 crc kubenswrapper[4697]: E0220 16:35:44.906514 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.44:6443: connect: connection refused" interval="3.2s" Feb 20 16:35:46 crc kubenswrapper[4697]: E0220 16:35:46.546867 4697 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189601aacc07fbaf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 16:35:36.496012207 +0000 UTC m=+244.276057625,LastTimestamp:2026-02-20 16:35:36.496012207 +0000 UTC m=+244.276057625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 16:35:48 crc kubenswrapper[4697]: E0220 16:35:48.108208 4697 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="6.4s" Feb 20 16:35:49 crc kubenswrapper[4697]: I0220 16:35:49.876890 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:49 crc kubenswrapper[4697]: I0220 16:35:49.877905 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:49 crc kubenswrapper[4697]: I0220 16:35:49.878674 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:49 crc kubenswrapper[4697]: I0220 16:35:49.879049 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:49 crc kubenswrapper[4697]: I0220 16:35:49.892804 4697 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4f59f9e2-de79-485a-b5fd-d4d65365f47f" Feb 20 16:35:49 crc kubenswrapper[4697]: I0220 16:35:49.892856 4697 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4f59f9e2-de79-485a-b5fd-d4d65365f47f" Feb 20 16:35:49 crc kubenswrapper[4697]: E0220 16:35:49.893542 4697 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:49 crc kubenswrapper[4697]: I0220 16:35:49.894300 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:50 crc kubenswrapper[4697]: I0220 16:35:50.182358 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 16:35:50 crc kubenswrapper[4697]: I0220 16:35:50.182665 4697 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a" exitCode=1 Feb 20 16:35:50 crc kubenswrapper[4697]: I0220 16:35:50.182715 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a"} Feb 20 16:35:50 crc kubenswrapper[4697]: I0220 16:35:50.183092 4697 scope.go:117] "RemoveContainer" containerID="46575c94907ad367307eb7855fc25bbe268e087a4ca31ddcdbce7d1496df0a9a" Feb 20 16:35:50 crc kubenswrapper[4697]: I0220 16:35:50.183918 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:50 crc kubenswrapper[4697]: I0220 16:35:50.184259 4697 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:50 crc kubenswrapper[4697]: I0220 16:35:50.184883 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:50 crc kubenswrapper[4697]: I0220 16:35:50.185176 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:50 crc kubenswrapper[4697]: I0220 16:35:50.186999 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0b3dfc36dc991ec53b00b4d442818b9391db58b4e183ab5e472329fbee9d9119"} Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.199758 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.200146 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fb6daf0124a59fc13cb7f7b5daa14518dc262e861c431fba5fed67d32f9211f2"} Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.202549 4697 status_manager.go:851] "Failed to get status for 
pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.203111 4697 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ee0ce4dfc42fdd762b03d036ca33cb84235fbb60e9289a9255027c9915282407" exitCode=0 Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.203180 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ee0ce4dfc42fdd762b03d036ca33cb84235fbb60e9289a9255027c9915282407"} Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.203480 4697 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.203794 4697 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4f59f9e2-de79-485a-b5fd-d4d65365f47f" Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.203823 4697 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4f59f9e2-de79-485a-b5fd-d4d65365f47f" Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.203903 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:51 crc kubenswrapper[4697]: E0220 16:35:51.204398 4697 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.204425 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.205130 4697 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.205891 4697 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.206422 4697 status_manager.go:851] "Failed to get status for pod" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" pod="openshift-authentication/oauth-openshift-558db77b4-m8rl6" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m8rl6\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:51 crc kubenswrapper[4697]: I0220 16:35:51.207812 4697 status_manager.go:851] "Failed to get status for pod" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Feb 20 16:35:52 crc kubenswrapper[4697]: I0220 16:35:52.212898 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bf6ac855daa7cf720c7e9ac8efff967efa81d7685978274f9bf3fc1c6d69ea4a"} Feb 20 16:35:52 crc kubenswrapper[4697]: I0220 16:35:52.213245 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f5e3d9695ca0c0660dbbf63fb8b93f01a260571fc494e65a7f1e3d478b2fba00"} Feb 20 16:35:52 crc kubenswrapper[4697]: I0220 16:35:52.213256 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"10f3622e6a95d0283d310171bf29c78c8d7115afa8abcc4b1728116495c5c7a4"} Feb 20 16:35:52 crc kubenswrapper[4697]: I0220 16:35:52.213267 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bbee8addf0e3f678e20dd268acc2b7b751d168fbcf36cd8b141336a39c0ba7ca"} Feb 20 16:35:53 crc kubenswrapper[4697]: I0220 16:35:53.219415 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0607730f24ac643d28a68b254a870dfff72797fe9af7c93009fbd321cc2360df"} Feb 20 16:35:53 crc kubenswrapper[4697]: I0220 16:35:53.219796 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:53 crc kubenswrapper[4697]: I0220 16:35:53.219681 4697 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4f59f9e2-de79-485a-b5fd-d4d65365f47f" Feb 20 16:35:53 crc kubenswrapper[4697]: I0220 16:35:53.219819 4697 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4f59f9e2-de79-485a-b5fd-d4d65365f47f" Feb 20 16:35:54 crc kubenswrapper[4697]: I0220 16:35:54.895376 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:54 crc kubenswrapper[4697]: I0220 16:35:54.895936 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:54 crc kubenswrapper[4697]: I0220 16:35:54.905714 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:56 crc kubenswrapper[4697]: I0220 16:35:56.086297 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:35:56 crc kubenswrapper[4697]: I0220 16:35:56.672007 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:35:56 crc kubenswrapper[4697]: I0220 16:35:56.678049 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:35:58 crc kubenswrapper[4697]: I0220 16:35:58.227853 4697 kubelet.go:1914] 
"Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:58 crc kubenswrapper[4697]: I0220 16:35:58.248422 4697 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4f59f9e2-de79-485a-b5fd-d4d65365f47f" Feb 20 16:35:58 crc kubenswrapper[4697]: I0220 16:35:58.248463 4697 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4f59f9e2-de79-485a-b5fd-d4d65365f47f" Feb 20 16:35:58 crc kubenswrapper[4697]: I0220 16:35:58.252076 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:35:58 crc kubenswrapper[4697]: I0220 16:35:58.254072 4697 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="07d6f79b-2474-410b-9aa9-7a5d45803425" Feb 20 16:35:59 crc kubenswrapper[4697]: I0220 16:35:59.254077 4697 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4f59f9e2-de79-485a-b5fd-d4d65365f47f" Feb 20 16:35:59 crc kubenswrapper[4697]: I0220 16:35:59.254120 4697 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4f59f9e2-de79-485a-b5fd-d4d65365f47f" Feb 20 16:36:01 crc kubenswrapper[4697]: I0220 16:36:01.711169 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:36:01 crc kubenswrapper[4697]: I0220 16:36:01.711911 4697 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:36:01 crc kubenswrapper[4697]: I0220 16:36:01.715665 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 20 16:36:01 crc kubenswrapper[4697]: I0220 16:36:01.715834 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 20 16:36:01 crc kubenswrapper[4697]: I0220 16:36:01.724067 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:36:01 crc kubenswrapper[4697]: I0220 16:36:01.733376 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:36:02 crc kubenswrapper[4697]: I0220 16:36:02.001530 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 16:36:02 crc kubenswrapper[4697]: W0220 16:36:02.295015 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-da567048e25fdb4372e90fdd54d71f49287282006d0a0e2ca6ee79e06f22f518 WatchSource:0}: Error finding container da567048e25fdb4372e90fdd54d71f49287282006d0a0e2ca6ee79e06f22f518: Status 404 returned error can't find the container with id da567048e25fdb4372e90fdd54d71f49287282006d0a0e2ca6ee79e06f22f518 Feb 20 16:36:02 crc kubenswrapper[4697]: I0220 16:36:02.910368 4697 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="07d6f79b-2474-410b-9aa9-7a5d45803425" Feb 20 16:36:03 crc kubenswrapper[4697]: I0220 16:36:03.295223 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3a69d2667834239aac75513965cf6ff48f7fc3a471229cd829b08705f7b5e195"} Feb 20 16:36:03 crc kubenswrapper[4697]: I0220 16:36:03.295359 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"da567048e25fdb4372e90fdd54d71f49287282006d0a0e2ca6ee79e06f22f518"} Feb 20 16:36:06 crc kubenswrapper[4697]: I0220 16:36:06.090786 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 16:36:07 crc kubenswrapper[4697]: I0220 16:36:07.743664 4697 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 20 16:36:08 crc kubenswrapper[4697]: I0220 16:36:08.287636 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 20 16:36:08 crc kubenswrapper[4697]: I0220 16:36:08.491905 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 20 16:36:08 crc kubenswrapper[4697]: I0220 16:36:08.657354 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 20 16:36:08 crc kubenswrapper[4697]: I0220 16:36:08.772021 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 20 16:36:08 crc kubenswrapper[4697]: I0220 16:36:08.817611 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 20 16:36:09 crc kubenswrapper[4697]: I0220 16:36:09.116843 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 20 16:36:09 crc kubenswrapper[4697]: I0220 16:36:09.125015 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 16:36:09 crc kubenswrapper[4697]: I0220 16:36:09.366155 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 20 16:36:09 crc kubenswrapper[4697]: I0220 16:36:09.382608 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 20 16:36:09 crc kubenswrapper[4697]: I0220 16:36:09.421000 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 20 16:36:09 crc kubenswrapper[4697]: I0220 16:36:09.479066 4697 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 20 16:36:09 crc kubenswrapper[4697]: I0220 16:36:09.711041 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 20 16:36:09 crc kubenswrapper[4697]: I0220 16:36:09.882626 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 16:36:09 crc kubenswrapper[4697]: I0220 16:36:09.897355 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.333392 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.359234 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.366367 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.450807 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.543692 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.579394 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.588924 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 20 
16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.593215 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.675215 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.694420 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.709109 4697 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.710981 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.798667 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.852408 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 20 16:36:10 crc kubenswrapper[4697]: I0220 16:36:10.943388 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:10.999908 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.194093 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.197489 4697 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.224599 4697 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.318769 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.346278 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.541661 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.554026 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.565730 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.619660 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.771823 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.772333 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.776046 4697 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.816391 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.832895 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.839009 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.950453 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 20 16:36:11 crc kubenswrapper[4697]: I0220 16:36:11.991604 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.168819 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.218947 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.233051 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.259841 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.272282 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.330008 4697 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.452955 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.494652 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.496166 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.523465 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.599501 4697 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.604203 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.604164114 podStartE2EDuration="36.604164114s" podCreationTimestamp="2026-02-20 16:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:35:57.997920425 +0000 UTC m=+265.777965833" watchObservedRunningTime="2026-02-20 16:36:12.604164114 +0000 UTC m=+280.384209562" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.608712 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-m8rl6"] Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.608825 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.612823 
4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.613124 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.614879 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.630270 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.630250757 podStartE2EDuration="14.630250757s" podCreationTimestamp="2026-02-20 16:35:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:36:12.627384643 +0000 UTC m=+280.407430051" watchObservedRunningTime="2026-02-20 16:36:12.630250757 +0000 UTC m=+280.410296155" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.746326 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.773511 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 20 16:36:12 crc kubenswrapper[4697]: I0220 16:36:12.886821 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" path="/var/lib/kubelet/pods/d919170d-e25f-4a96-9503-edaf4c0c3c51/volumes" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.045698 4697 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.063880 4697 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.065410 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.072836 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.296996 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.356287 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.400910 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.495835 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.508640 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.553142 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.565127 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.567170 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 20 16:36:13 crc 
kubenswrapper[4697]: I0220 16:36:13.568929 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.633888 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.655968 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.697474 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.700120 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.829178 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.889380 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.891714 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.901977 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.920379 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.953916 4697 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.954134 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 20 16:36:13 crc kubenswrapper[4697]: I0220 16:36:13.965073 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.009922 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.020845 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.077680 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.285721 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.364480 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.382579 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.435301 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.463151 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.519279 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.532797 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.594678 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.657844 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.798948 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.809177 4697 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.869002 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.895420 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.923180 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.944474 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.947669 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 20 16:36:14 crc kubenswrapper[4697]: I0220 16:36:14.984668 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.194066 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.340890 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.367225 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.523108 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.637687 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.638016 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.645801 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.664505 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.885897 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.902146 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.940170 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 20 16:36:15 crc kubenswrapper[4697]: I0220 16:36:15.983920 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.225801 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.232927 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.240717 4697 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.278584 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.306093 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.366319 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.373776 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.440060 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.530508 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.530774 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.575700 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.805870 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.835169 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.846228 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 20 16:36:16 crc kubenswrapper[4697]: I0220 16:36:16.941120 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.021676 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.034334 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.085959 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.096888 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.177940 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.233653 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.432873 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.498978 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.635233 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.642897 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.683843 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.740120 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.792091 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 20 16:36:17 crc kubenswrapper[4697]: I0220 16:36:17.817727 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.015865 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.090303 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.095477 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.175539 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.226604 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.240419 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.283179 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.386483 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.474063 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.492962 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.546774 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.565611 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.577881 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.584742 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.682203 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.720514 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.755698 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.761816 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.773548 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.821335 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.841342 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.921956 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.948227 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 20 16:36:18 crc kubenswrapper[4697]: I0220 16:36:18.952799 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.110040 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.161897 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.244907 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.301800 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.310397 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.313357 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.320060 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.332492 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.583252 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.596195 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.678151 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.732808 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.737406 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 20 16:36:19 crc kubenswrapper[4697]: I0220 16:36:19.792196 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.304647 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.392172 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.402214 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.458877 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.483135 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.543121 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.583417 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.604194 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.632736 4697 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.633187 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5" gracePeriod=5
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.635966 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.667049 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.694014 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.778942 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.870568 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 20 16:36:20 crc kubenswrapper[4697]: I0220 16:36:20.892634 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.158658 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.205614 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.233466 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.284426 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.386137 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.468403 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.588637 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.684335 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.708890 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.741798 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.937395 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 20 16:36:21 crc kubenswrapper[4697]: I0220 16:36:21.964128 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 20 16:36:22 crc kubenswrapper[4697]: I0220 16:36:22.036556 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 20 16:36:22 crc kubenswrapper[4697]: I0220 16:36:22.122522 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 20 16:36:22 crc kubenswrapper[4697]: I0220 16:36:22.199106 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 20 16:36:22 crc kubenswrapper[4697]: I0220 16:36:22.521148 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 20 16:36:22 crc kubenswrapper[4697]: I0220 16:36:22.670118 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 20 16:36:22 crc kubenswrapper[4697]: I0220 16:36:22.814558 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 20 16:36:22 crc kubenswrapper[4697]: I0220 16:36:22.936666 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 20 16:36:22 crc kubenswrapper[4697]: I0220 16:36:22.979097 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.012145 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"]
Feb 20 16:36:23 crc kubenswrapper[4697]: E0220 16:36:23.012581 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.012613 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 20 16:36:23 crc kubenswrapper[4697]: E0220 16:36:23.012632 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" containerName="oauth-openshift"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.012650 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" containerName="oauth-openshift"
Feb 20 16:36:23 crc kubenswrapper[4697]: E0220 16:36:23.012681 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" containerName="installer"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.012699 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" containerName="installer"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.012917 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="d919170d-e25f-4a96-9503-edaf4c0c3c51" containerName="oauth-openshift"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.012946 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.012969 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a228a0-19c1-441c-a35e-7e2bc741dc2d" containerName="installer"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.013794 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.024569 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.025935 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.028127 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.028276 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.028377 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.028819 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.029751 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.030165 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.030547 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.030601 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.030632 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.030654 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.030805 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.031309 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"]
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.034333 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.034883 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.053496 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.054851 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-router-certs\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.054961 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7n6c\" (UniqueName: \"kubernetes.io/projected/9fd6b677-7c3b-4b70-b7a6-d68207813bac-kube-api-access-f7n6c\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.058370 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.058558 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-audit-policies\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.058585 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.058761 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-template-error\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.058816 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.058849 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-session\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.058892 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-service-ca\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.059062 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-serving-cert\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.059142 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9fd6b677-7c3b-4b70-b7a6-d68207813bac-audit-dir\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.059202 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-cliconfig\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.059300 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-template-login\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.059372 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.060614 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.066301 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.161793 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-audit-policies\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.161877 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.161942 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-template-error\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.161984 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.162433 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-session\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.162586 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-service-ca\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.162663 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-serving-cert\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.163155 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9fd6b677-7c3b-4b70-b7a6-d68207813bac-audit-dir\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"
Feb 20 16:36:23 crc 
kubenswrapper[4697]: I0220 16:36:23.163202 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-cliconfig\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.163239 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9fd6b677-7c3b-4b70-b7a6-d68207813bac-audit-dir\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.163268 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-template-login\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.163294 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.163360 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-audit-policies\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" 
(UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.163394 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-router-certs\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.163422 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7n6c\" (UniqueName: \"kubernetes.io/projected/9fd6b677-7c3b-4b70-b7a6-d68207813bac-kube-api-access-f7n6c\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.163500 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.164012 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-cliconfig\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.164079 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.165080 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-service-ca\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.170902 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.170969 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.171213 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-template-error\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: 
\"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.172493 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-serving-cert\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.172536 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-template-login\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.172723 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-session\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.173518 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.175119 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9fd6b677-7c3b-4b70-b7a6-d68207813bac-v4-0-config-system-router-certs\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.194128 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7n6c\" (UniqueName: \"kubernetes.io/projected/9fd6b677-7c3b-4b70-b7a6-d68207813bac-kube-api-access-f7n6c\") pod \"oauth-openshift-b6fcd9dcb-h5t2q\" (UID: \"9fd6b677-7c3b-4b70-b7a6-d68207813bac\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.367009 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.399689 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.541477 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.713815 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.871915 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 20 16:36:23 crc kubenswrapper[4697]: I0220 16:36:23.881699 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q"] Feb 20 16:36:24 crc kubenswrapper[4697]: I0220 16:36:24.121033 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 20 16:36:24 crc 
kubenswrapper[4697]: I0220 16:36:24.136429 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 20 16:36:24 crc kubenswrapper[4697]: I0220 16:36:24.443749 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" event={"ID":"9fd6b677-7c3b-4b70-b7a6-d68207813bac","Type":"ContainerStarted","Data":"dcec40b18c71c7fae21c7516f94f24265cb8461def1368ee8a690a25bf50d10d"} Feb 20 16:36:24 crc kubenswrapper[4697]: I0220 16:36:24.443848 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:24 crc kubenswrapper[4697]: I0220 16:36:24.443928 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" event={"ID":"9fd6b677-7c3b-4b70-b7a6-d68207813bac","Type":"ContainerStarted","Data":"62dc2d794acd378399b6cda7f3ded20360fe0a2492e71c935a9198820acf99f8"} Feb 20 16:36:24 crc kubenswrapper[4697]: I0220 16:36:24.461217 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 16:36:24 crc kubenswrapper[4697]: I0220 16:36:24.478311 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" podStartSLOduration=72.478283346 podStartE2EDuration="1m12.478283346s" podCreationTimestamp="2026-02-20 16:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:36:24.474698023 +0000 UTC m=+292.254743431" watchObservedRunningTime="2026-02-20 16:36:24.478283346 +0000 UTC m=+292.258328814" Feb 20 16:36:24 crc kubenswrapper[4697]: I0220 16:36:24.514097 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-b6fcd9dcb-h5t2q" Feb 20 16:36:24 crc kubenswrapper[4697]: I0220 16:36:24.562654 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 16:36:25 crc kubenswrapper[4697]: I0220 16:36:25.208801 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 20 16:36:25 crc kubenswrapper[4697]: I0220 16:36:25.847199 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.202261 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.202335 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300236 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300288 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300332 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300375 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300483 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300483 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300524 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300625 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300700 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300952 4697 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300972 4697 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300981 4697 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.300989 4697 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.308118 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.401694 4697 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.452548 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.452609 4697 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5" exitCode=137 Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.453363 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.453836 4697 scope.go:117] "RemoveContainer" containerID="4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.470547 4697 scope.go:117] "RemoveContainer" containerID="4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5" Feb 20 16:36:26 crc kubenswrapper[4697]: E0220 16:36:26.471036 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5\": container with ID starting with 4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5 not found: ID does not exist" containerID="4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.471089 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5"} err="failed to get container status \"4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5\": rpc error: code = NotFound desc = could not find container \"4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5\": container with ID starting with 4311e34b31940559ff3b5e4f7b39d555fc91b49e353dba8adc0c95c888a778e5 not found: ID does not exist" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.884778 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.885118 4697 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 
16:36:26.893528 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.893567 4697 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5d6c6147-1e7f-4902-9e2f-864e01085145" Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.896731 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 16:36:26 crc kubenswrapper[4697]: I0220 16:36:26.896772 4697 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5d6c6147-1e7f-4902-9e2f-864e01085145" Feb 20 16:36:32 crc kubenswrapper[4697]: I0220 16:36:32.673590 4697 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 20 16:36:41 crc kubenswrapper[4697]: I0220 16:36:41.539571 4697 generic.go:334] "Generic (PLEG): container finished" podID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerID="07441c3fbdabb91968fc2535f0a204ecf1c420b6c747e2e87ae34f0f2a0bcdcc" exitCode=0 Feb 20 16:36:41 crc kubenswrapper[4697]: I0220 16:36:41.539643 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" event={"ID":"adf825e0-c430-439f-9d7c-55b7582f1b54","Type":"ContainerDied","Data":"07441c3fbdabb91968fc2535f0a204ecf1c420b6c747e2e87ae34f0f2a0bcdcc"} Feb 20 16:36:41 crc kubenswrapper[4697]: I0220 16:36:41.541118 4697 scope.go:117] "RemoveContainer" containerID="07441c3fbdabb91968fc2535f0a204ecf1c420b6c747e2e87ae34f0f2a0bcdcc" Feb 20 16:36:42 crc kubenswrapper[4697]: I0220 16:36:42.548518 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" 
event={"ID":"adf825e0-c430-439f-9d7c-55b7582f1b54","Type":"ContainerStarted","Data":"bac1d2fa294d54ed5e6cd079d5877b9120b43a67a747ad83a6871381cf22b30e"} Feb 20 16:36:42 crc kubenswrapper[4697]: I0220 16:36:42.549525 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:36:42 crc kubenswrapper[4697]: I0220 16:36:42.555069 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:36:46 crc kubenswrapper[4697]: I0220 16:36:46.262517 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 20 16:36:59 crc kubenswrapper[4697]: I0220 16:36:59.564752 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fp67j"] Feb 20 16:36:59 crc kubenswrapper[4697]: I0220 16:36:59.566912 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" podUID="4a4049af-a7dd-47b0-8dea-da1d662031c5" containerName="controller-manager" containerID="cri-o://23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6" gracePeriod=30 Feb 20 16:36:59 crc kubenswrapper[4697]: I0220 16:36:59.660278 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2"] Feb 20 16:36:59 crc kubenswrapper[4697]: I0220 16:36:59.661314 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" podUID="3e44b077-c323-48e8-be50-124f0a01d7d1" containerName="route-controller-manager" containerID="cri-o://ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4" gracePeriod=30 Feb 20 16:36:59 crc kubenswrapper[4697]: I0220 16:36:59.934025 4697 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:36:59 crc kubenswrapper[4697]: I0220 16:36:59.997273 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.079767 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-proxy-ca-bundles\") pod \"4a4049af-a7dd-47b0-8dea-da1d662031c5\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.079840 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-config\") pod \"4a4049af-a7dd-47b0-8dea-da1d662031c5\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.079923 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-client-ca\") pod \"4a4049af-a7dd-47b0-8dea-da1d662031c5\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.079947 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvx84\" (UniqueName: \"kubernetes.io/projected/4a4049af-a7dd-47b0-8dea-da1d662031c5-kube-api-access-lvx84\") pod \"4a4049af-a7dd-47b0-8dea-da1d662031c5\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.079968 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4a4049af-a7dd-47b0-8dea-da1d662031c5-serving-cert\") pod \"4a4049af-a7dd-47b0-8dea-da1d662031c5\" (UID: \"4a4049af-a7dd-47b0-8dea-da1d662031c5\") " Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.080551 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-config" (OuterVolumeSpecName: "config") pod "4a4049af-a7dd-47b0-8dea-da1d662031c5" (UID: "4a4049af-a7dd-47b0-8dea-da1d662031c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.080543 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4a4049af-a7dd-47b0-8dea-da1d662031c5" (UID: "4a4049af-a7dd-47b0-8dea-da1d662031c5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.080732 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-client-ca" (OuterVolumeSpecName: "client-ca") pod "4a4049af-a7dd-47b0-8dea-da1d662031c5" (UID: "4a4049af-a7dd-47b0-8dea-da1d662031c5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.085892 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a4049af-a7dd-47b0-8dea-da1d662031c5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4a4049af-a7dd-47b0-8dea-da1d662031c5" (UID: "4a4049af-a7dd-47b0-8dea-da1d662031c5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.085934 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a4049af-a7dd-47b0-8dea-da1d662031c5-kube-api-access-lvx84" (OuterVolumeSpecName: "kube-api-access-lvx84") pod "4a4049af-a7dd-47b0-8dea-da1d662031c5" (UID: "4a4049af-a7dd-47b0-8dea-da1d662031c5"). InnerVolumeSpecName "kube-api-access-lvx84". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.181621 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-config\") pod \"3e44b077-c323-48e8-be50-124f0a01d7d1\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.181746 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phqrt\" (UniqueName: \"kubernetes.io/projected/3e44b077-c323-48e8-be50-124f0a01d7d1-kube-api-access-phqrt\") pod \"3e44b077-c323-48e8-be50-124f0a01d7d1\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.181851 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e44b077-c323-48e8-be50-124f0a01d7d1-serving-cert\") pod \"3e44b077-c323-48e8-be50-124f0a01d7d1\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.181890 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-client-ca\") pod \"3e44b077-c323-48e8-be50-124f0a01d7d1\" (UID: \"3e44b077-c323-48e8-be50-124f0a01d7d1\") " Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.182214 4697 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.182237 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvx84\" (UniqueName: \"kubernetes.io/projected/4a4049af-a7dd-47b0-8dea-da1d662031c5-kube-api-access-lvx84\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.182258 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4049af-a7dd-47b0-8dea-da1d662031c5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.182277 4697 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.182294 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a4049af-a7dd-47b0-8dea-da1d662031c5-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.182471 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-config" (OuterVolumeSpecName: "config") pod "3e44b077-c323-48e8-be50-124f0a01d7d1" (UID: "3e44b077-c323-48e8-be50-124f0a01d7d1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.183051 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-client-ca" (OuterVolumeSpecName: "client-ca") pod "3e44b077-c323-48e8-be50-124f0a01d7d1" (UID: "3e44b077-c323-48e8-be50-124f0a01d7d1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.185699 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e44b077-c323-48e8-be50-124f0a01d7d1-kube-api-access-phqrt" (OuterVolumeSpecName: "kube-api-access-phqrt") pod "3e44b077-c323-48e8-be50-124f0a01d7d1" (UID: "3e44b077-c323-48e8-be50-124f0a01d7d1"). InnerVolumeSpecName "kube-api-access-phqrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.187774 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e44b077-c323-48e8-be50-124f0a01d7d1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3e44b077-c323-48e8-be50-124f0a01d7d1" (UID: "3e44b077-c323-48e8-be50-124f0a01d7d1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.283399 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phqrt\" (UniqueName: \"kubernetes.io/projected/3e44b077-c323-48e8-be50-124f0a01d7d1-kube-api-access-phqrt\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.283458 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e44b077-c323-48e8-be50-124f0a01d7d1-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.283472 4697 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.283486 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e44b077-c323-48e8-be50-124f0a01d7d1-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.658532 4697 generic.go:334] "Generic (PLEG): container finished" podID="3e44b077-c323-48e8-be50-124f0a01d7d1" containerID="ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4" exitCode=0 Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.658628 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" event={"ID":"3e44b077-c323-48e8-be50-124f0a01d7d1","Type":"ContainerDied","Data":"ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4"} Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.658672 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" 
event={"ID":"3e44b077-c323-48e8-be50-124f0a01d7d1","Type":"ContainerDied","Data":"e025121ed28f88c52cc3ed0323107b6147f50f30bcaf2f7564a0a4c47af309d7"} Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.658702 4697 scope.go:117] "RemoveContainer" containerID="ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.658689 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.662064 4697 generic.go:334] "Generic (PLEG): container finished" podID="4a4049af-a7dd-47b0-8dea-da1d662031c5" containerID="23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6" exitCode=0 Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.662121 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" event={"ID":"4a4049af-a7dd-47b0-8dea-da1d662031c5","Type":"ContainerDied","Data":"23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6"} Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.662168 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" event={"ID":"4a4049af-a7dd-47b0-8dea-da1d662031c5","Type":"ContainerDied","Data":"256c72c8cb5c2b8dbbb0245db2fa23324cf7bc3e4540c382d5aa24b23e7ae736"} Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.662227 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fp67j" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.678560 4697 scope.go:117] "RemoveContainer" containerID="ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4" Feb 20 16:37:00 crc kubenswrapper[4697]: E0220 16:37:00.679112 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4\": container with ID starting with ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4 not found: ID does not exist" containerID="ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.679187 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4"} err="failed to get container status \"ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4\": rpc error: code = NotFound desc = could not find container \"ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4\": container with ID starting with ac17b692f111e3b0ba5effa6bce6c139778fa978576048d1b599d5f9d4668ae4 not found: ID does not exist" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.679231 4697 scope.go:117] "RemoveContainer" containerID="23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.732395 4697 scope.go:117] "RemoveContainer" containerID="23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6" Feb 20 16:37:00 crc kubenswrapper[4697]: E0220 16:37:00.733098 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6\": container with ID starting with 
23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6 not found: ID does not exist" containerID="23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.733277 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6"} err="failed to get container status \"23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6\": rpc error: code = NotFound desc = could not find container \"23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6\": container with ID starting with 23a4e06b8fec59b7f7dc04c592d209bb8a5482ee193b112c0537bf80838bbee6 not found: ID does not exist" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.758780 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2"] Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.766349 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qvhc2"] Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.772552 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fp67j"] Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.782143 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fp67j"] Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.890302 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e44b077-c323-48e8-be50-124f0a01d7d1" path="/var/lib/kubelet/pods/3e44b077-c323-48e8-be50-124f0a01d7d1/volumes" Feb 20 16:37:00 crc kubenswrapper[4697]: I0220 16:37:00.891394 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a4049af-a7dd-47b0-8dea-da1d662031c5" 
path="/var/lib/kubelet/pods/4a4049af-a7dd-47b0-8dea-da1d662031c5/volumes" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.039088 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8dd6874d8-gh4vn"] Feb 20 16:37:01 crc kubenswrapper[4697]: E0220 16:37:01.040900 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4049af-a7dd-47b0-8dea-da1d662031c5" containerName="controller-manager" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.040948 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4049af-a7dd-47b0-8dea-da1d662031c5" containerName="controller-manager" Feb 20 16:37:01 crc kubenswrapper[4697]: E0220 16:37:01.041002 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e44b077-c323-48e8-be50-124f0a01d7d1" containerName="route-controller-manager" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.041020 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e44b077-c323-48e8-be50-124f0a01d7d1" containerName="route-controller-manager" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.041238 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e44b077-c323-48e8-be50-124f0a01d7d1" containerName="route-controller-manager" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.041274 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a4049af-a7dd-47b0-8dea-da1d662031c5" containerName="controller-manager" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.042139 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.045092 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.045216 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.046814 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75"] Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.046943 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.047797 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.048100 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.051255 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.053592 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.053705 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.054035 4697 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.054469 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.054486 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.055054 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.055504 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.060885 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.067505 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8dd6874d8-gh4vn"] Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.074231 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75"] Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.095244 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-config\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.095314 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-client-ca\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.095354 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-proxy-ca-bundles\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.095460 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28k2l\" (UniqueName: \"kubernetes.io/projected/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-kube-api-access-28k2l\") pod \"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.095499 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-client-ca\") pod \"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.095535 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f14216-40db-4e0c-9606-19692694b9b2-serving-cert\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " 
pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.095572 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfthq\" (UniqueName: \"kubernetes.io/projected/24f14216-40db-4e0c-9606-19692694b9b2-kube-api-access-gfthq\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.095693 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-config\") pod \"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.095727 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-serving-cert\") pod \"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.185107 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.185213 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" 
podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.197161 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-config\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.197276 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-client-ca\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.197338 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-proxy-ca-bundles\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.197415 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28k2l\" (UniqueName: \"kubernetes.io/projected/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-kube-api-access-28k2l\") pod \"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.197505 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-client-ca\") pod \"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.197553 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f14216-40db-4e0c-9606-19692694b9b2-serving-cert\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.197614 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfthq\" (UniqueName: \"kubernetes.io/projected/24f14216-40db-4e0c-9606-19692694b9b2-kube-api-access-gfthq\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.197680 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-serving-cert\") pod \"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.197727 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-config\") pod \"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " 
pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.198991 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-client-ca\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.199425 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-client-ca\") pod \"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.199542 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-config\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.200662 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-proxy-ca-bundles\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.200750 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-config\") pod 
\"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.204672 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f14216-40db-4e0c-9606-19692694b9b2-serving-cert\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.207272 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-serving-cert\") pod \"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.228805 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfthq\" (UniqueName: \"kubernetes.io/projected/24f14216-40db-4e0c-9606-19692694b9b2-kube-api-access-gfthq\") pod \"controller-manager-8dd6874d8-gh4vn\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.245410 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28k2l\" (UniqueName: \"kubernetes.io/projected/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-kube-api-access-28k2l\") pod \"route-controller-manager-76ffc9c5fb-dkk75\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.381738 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.400666 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.600825 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8dd6874d8-gh4vn"] Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.627356 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75"] Feb 20 16:37:01 crc kubenswrapper[4697]: W0220 16:37:01.632806 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e103ea4_7859_4f6a_b2d2_6d75c07fb181.slice/crio-c497ce82edbb1625c395bfb57c5b691486fc6698543f1905dbf538c0f9b0eba8 WatchSource:0}: Error finding container c497ce82edbb1625c395bfb57c5b691486fc6698543f1905dbf538c0f9b0eba8: Status 404 returned error can't find the container with id c497ce82edbb1625c395bfb57c5b691486fc6698543f1905dbf538c0f9b0eba8 Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.667274 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" event={"ID":"2e103ea4-7859-4f6a-b2d2-6d75c07fb181","Type":"ContainerStarted","Data":"c497ce82edbb1625c395bfb57c5b691486fc6698543f1905dbf538c0f9b0eba8"} Feb 20 16:37:01 crc kubenswrapper[4697]: I0220 16:37:01.668709 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" event={"ID":"24f14216-40db-4e0c-9606-19692694b9b2","Type":"ContainerStarted","Data":"5bd2cb42c31bf5855d8733657f70383b2d9db4b95ab61e616b7594154decfc90"} Feb 20 16:37:02 crc kubenswrapper[4697]: I0220 
16:37:02.246851 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8dd6874d8-gh4vn"] Feb 20 16:37:02 crc kubenswrapper[4697]: I0220 16:37:02.253959 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75"] Feb 20 16:37:02 crc kubenswrapper[4697]: I0220 16:37:02.678214 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" event={"ID":"2e103ea4-7859-4f6a-b2d2-6d75c07fb181","Type":"ContainerStarted","Data":"c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc"} Feb 20 16:37:02 crc kubenswrapper[4697]: I0220 16:37:02.678487 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:02 crc kubenswrapper[4697]: I0220 16:37:02.680274 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" event={"ID":"24f14216-40db-4e0c-9606-19692694b9b2","Type":"ContainerStarted","Data":"1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3"} Feb 20 16:37:02 crc kubenswrapper[4697]: I0220 16:37:02.680489 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:02 crc kubenswrapper[4697]: I0220 16:37:02.683212 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:02 crc kubenswrapper[4697]: I0220 16:37:02.685791 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:02 crc kubenswrapper[4697]: I0220 16:37:02.701311 4697 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" podStartSLOduration=3.701294583 podStartE2EDuration="3.701294583s" podCreationTimestamp="2026-02-20 16:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:37:02.699106151 +0000 UTC m=+330.479151569" watchObservedRunningTime="2026-02-20 16:37:02.701294583 +0000 UTC m=+330.481339991" Feb 20 16:37:02 crc kubenswrapper[4697]: I0220 16:37:02.735546 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" podStartSLOduration=3.73552388 podStartE2EDuration="3.73552388s" podCreationTimestamp="2026-02-20 16:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:37:02.719239825 +0000 UTC m=+330.499285253" watchObservedRunningTime="2026-02-20 16:37:02.73552388 +0000 UTC m=+330.515569308" Feb 20 16:37:03 crc kubenswrapper[4697]: I0220 16:37:03.686218 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" podUID="2e103ea4-7859-4f6a-b2d2-6d75c07fb181" containerName="route-controller-manager" containerID="cri-o://c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc" gracePeriod=30 Feb 20 16:37:03 crc kubenswrapper[4697]: I0220 16:37:03.686400 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" podUID="24f14216-40db-4e0c-9606-19692694b9b2" containerName="controller-manager" containerID="cri-o://1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3" gracePeriod=30 Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.035569 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.065087 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr"] Feb 20 16:37:04 crc kubenswrapper[4697]: E0220 16:37:04.065300 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e103ea4-7859-4f6a-b2d2-6d75c07fb181" containerName="route-controller-manager" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.065311 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e103ea4-7859-4f6a-b2d2-6d75c07fb181" containerName="route-controller-manager" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.065408 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e103ea4-7859-4f6a-b2d2-6d75c07fb181" containerName="route-controller-manager" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.065794 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.074791 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr"] Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.133684 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-serving-cert\") pod \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.133739 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28k2l\" (UniqueName: \"kubernetes.io/projected/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-kube-api-access-28k2l\") pod \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.133775 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-config\") pod \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.133872 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkcvc\" (UniqueName: \"kubernetes.io/projected/f15db249-be96-4ba6-bac9-ea7c2d68d197-kube-api-access-xkcvc\") pod \"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.133905 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15db249-be96-4ba6-bac9-ea7c2d68d197-serving-cert\") pod \"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.133994 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-config\") pod \"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.134012 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-client-ca\") pod \"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.134785 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-config" (OuterVolumeSpecName: "config") pod "2e103ea4-7859-4f6a-b2d2-6d75c07fb181" (UID: "2e103ea4-7859-4f6a-b2d2-6d75c07fb181"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.138346 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2e103ea4-7859-4f6a-b2d2-6d75c07fb181" (UID: "2e103ea4-7859-4f6a-b2d2-6d75c07fb181"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.138521 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-kube-api-access-28k2l" (OuterVolumeSpecName: "kube-api-access-28k2l") pod "2e103ea4-7859-4f6a-b2d2-6d75c07fb181" (UID: "2e103ea4-7859-4f6a-b2d2-6d75c07fb181"). InnerVolumeSpecName "kube-api-access-28k2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.147024 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.234780 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f14216-40db-4e0c-9606-19692694b9b2-serving-cert\") pod \"24f14216-40db-4e0c-9606-19692694b9b2\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.234843 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-proxy-ca-bundles\") pod \"24f14216-40db-4e0c-9606-19692694b9b2\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.234877 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-config\") pod \"24f14216-40db-4e0c-9606-19692694b9b2\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.234897 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfthq\" (UniqueName: 
\"kubernetes.io/projected/24f14216-40db-4e0c-9606-19692694b9b2-kube-api-access-gfthq\") pod \"24f14216-40db-4e0c-9606-19692694b9b2\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.234918 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-client-ca\") pod \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\" (UID: \"2e103ea4-7859-4f6a-b2d2-6d75c07fb181\") " Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.234973 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-client-ca\") pod \"24f14216-40db-4e0c-9606-19692694b9b2\" (UID: \"24f14216-40db-4e0c-9606-19692694b9b2\") " Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.235070 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15db249-be96-4ba6-bac9-ea7c2d68d197-serving-cert\") pod \"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.235102 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-config\") pod \"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.235121 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-client-ca\") pod 
\"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.235179 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkcvc\" (UniqueName: \"kubernetes.io/projected/f15db249-be96-4ba6-bac9-ea7c2d68d197-kube-api-access-xkcvc\") pod \"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.235212 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.235224 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28k2l\" (UniqueName: \"kubernetes.io/projected/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-kube-api-access-28k2l\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.235234 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.235470 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-config" (OuterVolumeSpecName: "config") pod "24f14216-40db-4e0c-9606-19692694b9b2" (UID: "24f14216-40db-4e0c-9606-19692694b9b2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.235555 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-client-ca" (OuterVolumeSpecName: "client-ca") pod "2e103ea4-7859-4f6a-b2d2-6d75c07fb181" (UID: "2e103ea4-7859-4f6a-b2d2-6d75c07fb181"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.236113 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-client-ca" (OuterVolumeSpecName: "client-ca") pod "24f14216-40db-4e0c-9606-19692694b9b2" (UID: "24f14216-40db-4e0c-9606-19692694b9b2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.236184 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "24f14216-40db-4e0c-9606-19692694b9b2" (UID: "24f14216-40db-4e0c-9606-19692694b9b2"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.236181 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-client-ca\") pod \"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.236644 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-config\") pod \"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.238879 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24f14216-40db-4e0c-9606-19692694b9b2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "24f14216-40db-4e0c-9606-19692694b9b2" (UID: "24f14216-40db-4e0c-9606-19692694b9b2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.240337 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15db249-be96-4ba6-bac9-ea7c2d68d197-serving-cert\") pod \"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.240525 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f14216-40db-4e0c-9606-19692694b9b2-kube-api-access-gfthq" (OuterVolumeSpecName: "kube-api-access-gfthq") pod "24f14216-40db-4e0c-9606-19692694b9b2" (UID: "24f14216-40db-4e0c-9606-19692694b9b2"). InnerVolumeSpecName "kube-api-access-gfthq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.249830 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkcvc\" (UniqueName: \"kubernetes.io/projected/f15db249-be96-4ba6-bac9-ea7c2d68d197-kube-api-access-xkcvc\") pod \"route-controller-manager-864bf79549-sb9sr\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.335780 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.335814 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfthq\" (UniqueName: \"kubernetes.io/projected/24f14216-40db-4e0c-9606-19692694b9b2-kube-api-access-gfthq\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.335826 4697 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e103ea4-7859-4f6a-b2d2-6d75c07fb181-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.335836 4697 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.335845 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f14216-40db-4e0c-9606-19692694b9b2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.335853 4697 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24f14216-40db-4e0c-9606-19692694b9b2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.384732 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.601618 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr"] Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.692019 4697 generic.go:334] "Generic (PLEG): container finished" podID="2e103ea4-7859-4f6a-b2d2-6d75c07fb181" containerID="c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc" exitCode=0 Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.692076 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.692097 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" event={"ID":"2e103ea4-7859-4f6a-b2d2-6d75c07fb181","Type":"ContainerDied","Data":"c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc"} Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.692164 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75" event={"ID":"2e103ea4-7859-4f6a-b2d2-6d75c07fb181","Type":"ContainerDied","Data":"c497ce82edbb1625c395bfb57c5b691486fc6698543f1905dbf538c0f9b0eba8"} Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.692189 4697 scope.go:117] "RemoveContainer" containerID="c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.694249 4697 generic.go:334] "Generic (PLEG): container finished" podID="24f14216-40db-4e0c-9606-19692694b9b2" containerID="1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3" exitCode=0 Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.694308 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.694310 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" event={"ID":"24f14216-40db-4e0c-9606-19692694b9b2","Type":"ContainerDied","Data":"1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3"} Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.694421 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd6874d8-gh4vn" event={"ID":"24f14216-40db-4e0c-9606-19692694b9b2","Type":"ContainerDied","Data":"5bd2cb42c31bf5855d8733657f70383b2d9db4b95ab61e616b7594154decfc90"} Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.695952 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" event={"ID":"f15db249-be96-4ba6-bac9-ea7c2d68d197","Type":"ContainerStarted","Data":"080d4e01297c5a17c7fac449f78973cdbcb2097e24d18271a0652502ee41680b"} Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.708307 4697 scope.go:117] "RemoveContainer" containerID="c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc" Feb 20 16:37:04 crc kubenswrapper[4697]: E0220 16:37:04.708834 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc\": container with ID starting with c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc not found: ID does not exist" containerID="c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.708887 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc"} 
err="failed to get container status \"c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc\": rpc error: code = NotFound desc = could not find container \"c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc\": container with ID starting with c8fe6a9090f6a4841187061101e49089bd0afbc3f2a3af5bd53f06f05fa68edc not found: ID does not exist" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.708918 4697 scope.go:117] "RemoveContainer" containerID="1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.728477 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8dd6874d8-gh4vn"] Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.733380 4697 scope.go:117] "RemoveContainer" containerID="1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3" Feb 20 16:37:04 crc kubenswrapper[4697]: E0220 16:37:04.733916 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3\": container with ID starting with 1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3 not found: ID does not exist" containerID="1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.733960 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3"} err="failed to get container status \"1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3\": rpc error: code = NotFound desc = could not find container \"1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3\": container with ID starting with 1ad9009f544a1f11afa72ff6675176bf688824edc701d4d1ad2b5e8e0a71b1c3 not found: ID does not exist" Feb 20 16:37:04 crc 
kubenswrapper[4697]: I0220 16:37:04.735770 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8dd6874d8-gh4vn"] Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.739644 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75"] Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.744046 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76ffc9c5fb-dkk75"] Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.888180 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f14216-40db-4e0c-9606-19692694b9b2" path="/var/lib/kubelet/pods/24f14216-40db-4e0c-9606-19692694b9b2/volumes" Feb 20 16:37:04 crc kubenswrapper[4697]: I0220 16:37:04.889805 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e103ea4-7859-4f6a-b2d2-6d75c07fb181" path="/var/lib/kubelet/pods/2e103ea4-7859-4f6a-b2d2-6d75c07fb181/volumes" Feb 20 16:37:05 crc kubenswrapper[4697]: I0220 16:37:05.707896 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" event={"ID":"f15db249-be96-4ba6-bac9-ea7c2d68d197","Type":"ContainerStarted","Data":"f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053"} Feb 20 16:37:05 crc kubenswrapper[4697]: I0220 16:37:05.708188 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:05 crc kubenswrapper[4697]: I0220 16:37:05.712959 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:37:05 crc kubenswrapper[4697]: I0220 16:37:05.727102 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" podStartSLOduration=3.7270868740000003 podStartE2EDuration="3.727086874s" podCreationTimestamp="2026-02-20 16:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:37:05.725847318 +0000 UTC m=+333.505892746" watchObservedRunningTime="2026-02-20 16:37:05.727086874 +0000 UTC m=+333.507132282" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.040009 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c5cccfd57-6tll6"] Feb 20 16:37:07 crc kubenswrapper[4697]: E0220 16:37:07.040589 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f14216-40db-4e0c-9606-19692694b9b2" containerName="controller-manager" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.040605 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f14216-40db-4e0c-9606-19692694b9b2" containerName="controller-manager" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.040754 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f14216-40db-4e0c-9606-19692694b9b2" containerName="controller-manager" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.041145 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.044601 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.045704 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.046513 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.046526 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.046987 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.048347 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.063269 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.069710 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c5cccfd57-6tll6"] Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.071488 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-proxy-ca-bundles\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " 
pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.071525 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-client-ca\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.071577 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-config\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.071637 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-serving-cert\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.071657 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57n4d\" (UniqueName: \"kubernetes.io/projected/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-kube-api-access-57n4d\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.172296 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-serving-cert\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.172331 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57n4d\" (UniqueName: \"kubernetes.io/projected/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-kube-api-access-57n4d\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.172364 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-proxy-ca-bundles\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.172379 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-client-ca\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.172414 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-config\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.173563 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-client-ca\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.173886 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-config\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.174321 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-proxy-ca-bundles\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.181723 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-serving-cert\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.204697 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57n4d\" (UniqueName: \"kubernetes.io/projected/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-kube-api-access-57n4d\") pod \"controller-manager-7c5cccfd57-6tll6\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 
16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.379802 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.607566 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c5cccfd57-6tll6"] Feb 20 16:37:07 crc kubenswrapper[4697]: I0220 16:37:07.723909 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" event={"ID":"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8","Type":"ContainerStarted","Data":"b8f54ed22bb675179d85f8d76bbda80c91a2be374e92767bac21db8b40e8b2aa"} Feb 20 16:37:08 crc kubenswrapper[4697]: I0220 16:37:08.729355 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" event={"ID":"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8","Type":"ContainerStarted","Data":"8ad2027f9675a760cb53da06f4a0fae8cc318213f9c6cdfad85ca9da18b4ad30"} Feb 20 16:37:08 crc kubenswrapper[4697]: I0220 16:37:08.729728 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:08 crc kubenswrapper[4697]: I0220 16:37:08.735761 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:37:08 crc kubenswrapper[4697]: I0220 16:37:08.751807 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" podStartSLOduration=6.751780953 podStartE2EDuration="6.751780953s" podCreationTimestamp="2026-02-20 16:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:37:08.748178931 +0000 UTC m=+336.528224369" 
watchObservedRunningTime="2026-02-20 16:37:08.751780953 +0000 UTC m=+336.531826381" Feb 20 16:37:31 crc kubenswrapper[4697]: I0220 16:37:31.185008 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:37:31 crc kubenswrapper[4697]: I0220 16:37:31.185754 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.577288 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-577l6"] Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.579015 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.591076 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-577l6"] Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.655869 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.655968 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66ccaa3e-2754-48c8-a87a-779220321b53-bound-sa-token\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.656032 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66ccaa3e-2754-48c8-a87a-779220321b53-trusted-ca\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.656072 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66ccaa3e-2754-48c8-a87a-779220321b53-ca-trust-extracted\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.656098 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ccaa3e-2754-48c8-a87a-779220321b53-registry-tls\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.656129 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkn79\" (UniqueName: \"kubernetes.io/projected/66ccaa3e-2754-48c8-a87a-779220321b53-kube-api-access-fkn79\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.656164 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66ccaa3e-2754-48c8-a87a-779220321b53-installation-pull-secrets\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.656205 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66ccaa3e-2754-48c8-a87a-779220321b53-registry-certificates\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.674869 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.757756 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66ccaa3e-2754-48c8-a87a-779220321b53-registry-certificates\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.757854 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66ccaa3e-2754-48c8-a87a-779220321b53-bound-sa-token\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.757919 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66ccaa3e-2754-48c8-a87a-779220321b53-trusted-ca\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.757950 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66ccaa3e-2754-48c8-a87a-779220321b53-ca-trust-extracted\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc 
kubenswrapper[4697]: I0220 16:37:43.757987 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ccaa3e-2754-48c8-a87a-779220321b53-registry-tls\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.758047 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkn79\" (UniqueName: \"kubernetes.io/projected/66ccaa3e-2754-48c8-a87a-779220321b53-kube-api-access-fkn79\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.758089 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66ccaa3e-2754-48c8-a87a-779220321b53-installation-pull-secrets\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.759819 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66ccaa3e-2754-48c8-a87a-779220321b53-ca-trust-extracted\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.760652 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66ccaa3e-2754-48c8-a87a-779220321b53-registry-certificates\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") 
" pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.761508 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66ccaa3e-2754-48c8-a87a-779220321b53-trusted-ca\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.765024 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ccaa3e-2754-48c8-a87a-779220321b53-registry-tls\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.766130 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66ccaa3e-2754-48c8-a87a-779220321b53-installation-pull-secrets\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.778026 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66ccaa3e-2754-48c8-a87a-779220321b53-bound-sa-token\") pod \"image-registry-66df7c8f76-577l6\" (UID: \"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.780608 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkn79\" (UniqueName: \"kubernetes.io/projected/66ccaa3e-2754-48c8-a87a-779220321b53-kube-api-access-fkn79\") pod \"image-registry-66df7c8f76-577l6\" (UID: 
\"66ccaa3e-2754-48c8-a87a-779220321b53\") " pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:43 crc kubenswrapper[4697]: I0220 16:37:43.910746 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:44 crc kubenswrapper[4697]: I0220 16:37:44.300522 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-577l6"] Feb 20 16:37:44 crc kubenswrapper[4697]: I0220 16:37:44.945720 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-577l6" event={"ID":"66ccaa3e-2754-48c8-a87a-779220321b53","Type":"ContainerStarted","Data":"a836f60025601d4b769e0fb3e88e7e52a138bbf295879030cb5dec87264538ce"} Feb 20 16:37:44 crc kubenswrapper[4697]: I0220 16:37:44.946053 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:37:44 crc kubenswrapper[4697]: I0220 16:37:44.946065 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-577l6" event={"ID":"66ccaa3e-2754-48c8-a87a-779220321b53","Type":"ContainerStarted","Data":"68c609e53e17fdb565e8c86c5369377647cd0ec4f99d1eb7a6290d6c32749bb7"} Feb 20 16:37:59 crc kubenswrapper[4697]: I0220 16:37:59.556009 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-577l6" podStartSLOduration=16.55598647 podStartE2EDuration="16.55598647s" podCreationTimestamp="2026-02-20 16:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:37:44.96487824 +0000 UTC m=+372.744923648" watchObservedRunningTime="2026-02-20 16:37:59.55598647 +0000 UTC m=+387.336031878" Feb 20 16:37:59 crc kubenswrapper[4697]: I0220 16:37:59.560986 
4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c5cccfd57-6tll6"] Feb 20 16:37:59 crc kubenswrapper[4697]: I0220 16:37:59.561284 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" podUID="2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8" containerName="controller-manager" containerID="cri-o://8ad2027f9675a760cb53da06f4a0fae8cc318213f9c6cdfad85ca9da18b4ad30" gracePeriod=30 Feb 20 16:37:59 crc kubenswrapper[4697]: I0220 16:37:59.586119 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr"] Feb 20 16:37:59 crc kubenswrapper[4697]: I0220 16:37:59.586492 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" podUID="f15db249-be96-4ba6-bac9-ea7c2d68d197" containerName="route-controller-manager" containerID="cri-o://f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053" gracePeriod=30 Feb 20 16:37:59 crc kubenswrapper[4697]: I0220 16:37:59.972588 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.027698 4697 generic.go:334] "Generic (PLEG): container finished" podID="2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8" containerID="8ad2027f9675a760cb53da06f4a0fae8cc318213f9c6cdfad85ca9da18b4ad30" exitCode=0 Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.027818 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" event={"ID":"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8","Type":"ContainerDied","Data":"8ad2027f9675a760cb53da06f4a0fae8cc318213f9c6cdfad85ca9da18b4ad30"} Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.028313 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" event={"ID":"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8","Type":"ContainerDied","Data":"b8f54ed22bb675179d85f8d76bbda80c91a2be374e92767bac21db8b40e8b2aa"} Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.028367 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8f54ed22bb675179d85f8d76bbda80c91a2be374e92767bac21db8b40e8b2aa" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.029989 4697 generic.go:334] "Generic (PLEG): container finished" podID="f15db249-be96-4ba6-bac9-ea7c2d68d197" containerID="f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053" exitCode=0 Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.030025 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" event={"ID":"f15db249-be96-4ba6-bac9-ea7c2d68d197","Type":"ContainerDied","Data":"f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053"} Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.030044 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" event={"ID":"f15db249-be96-4ba6-bac9-ea7c2d68d197","Type":"ContainerDied","Data":"080d4e01297c5a17c7fac449f78973cdbcb2097e24d18271a0652502ee41680b"} Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.030062 4697 scope.go:117] "RemoveContainer" containerID="f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.030207 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.034499 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.054606 4697 scope.go:117] "RemoveContainer" containerID="f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053" Feb 20 16:38:00 crc kubenswrapper[4697]: E0220 16:38:00.054989 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053\": container with ID starting with f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053 not found: ID does not exist" containerID="f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.055016 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053"} err="failed to get container status \"f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053\": rpc error: code = NotFound desc = could not find container \"f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053\": container with ID starting with 
f78b97c5fac50931a34fa097e054f27f6ef0080c32bfc4a25be4ebaf11fa4053 not found: ID does not exist" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.091777 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-config\") pod \"f15db249-be96-4ba6-bac9-ea7c2d68d197\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.091848 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57n4d\" (UniqueName: \"kubernetes.io/projected/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-kube-api-access-57n4d\") pod \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.091880 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15db249-be96-4ba6-bac9-ea7c2d68d197-serving-cert\") pod \"f15db249-be96-4ba6-bac9-ea7c2d68d197\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.091897 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-proxy-ca-bundles\") pod \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.091917 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-serving-cert\") pod \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.091937 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xkcvc\" (UniqueName: \"kubernetes.io/projected/f15db249-be96-4ba6-bac9-ea7c2d68d197-kube-api-access-xkcvc\") pod \"f15db249-be96-4ba6-bac9-ea7c2d68d197\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.091958 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-config\") pod \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.091973 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-client-ca\") pod \"f15db249-be96-4ba6-bac9-ea7c2d68d197\" (UID: \"f15db249-be96-4ba6-bac9-ea7c2d68d197\") " Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.091998 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-client-ca\") pod \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\" (UID: \"2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8\") " Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.092824 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8" (UID: "2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.093010 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-config" (OuterVolumeSpecName: "config") pod "f15db249-be96-4ba6-bac9-ea7c2d68d197" (UID: "f15db249-be96-4ba6-bac9-ea7c2d68d197"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.093400 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-client-ca" (OuterVolumeSpecName: "client-ca") pod "f15db249-be96-4ba6-bac9-ea7c2d68d197" (UID: "f15db249-be96-4ba6-bac9-ea7c2d68d197"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.093505 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-config" (OuterVolumeSpecName: "config") pod "2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8" (UID: "2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.093796 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8" (UID: "2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.097307 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8" (UID: "2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.097392 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15db249-be96-4ba6-bac9-ea7c2d68d197-kube-api-access-xkcvc" (OuterVolumeSpecName: "kube-api-access-xkcvc") pod "f15db249-be96-4ba6-bac9-ea7c2d68d197" (UID: "f15db249-be96-4ba6-bac9-ea7c2d68d197"). InnerVolumeSpecName "kube-api-access-xkcvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.097493 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f15db249-be96-4ba6-bac9-ea7c2d68d197-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f15db249-be96-4ba6-bac9-ea7c2d68d197" (UID: "f15db249-be96-4ba6-bac9-ea7c2d68d197"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.098044 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-kube-api-access-57n4d" (OuterVolumeSpecName: "kube-api-access-57n4d") pod "2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8" (UID: "2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8"). InnerVolumeSpecName "kube-api-access-57n4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.193455 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.193859 4697 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.193943 4697 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.194008 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15db249-be96-4ba6-bac9-ea7c2d68d197-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.194071 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57n4d\" (UniqueName: \"kubernetes.io/projected/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-kube-api-access-57n4d\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.194128 4697 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15db249-be96-4ba6-bac9-ea7c2d68d197-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.194182 4697 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.194241 4697 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.194389 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkcvc\" (UniqueName: \"kubernetes.io/projected/f15db249-be96-4ba6-bac9-ea7c2d68d197-kube-api-access-xkcvc\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.369299 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr"] Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.372355 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864bf79549-sb9sr"] Feb 20 16:38:00 crc kubenswrapper[4697]: I0220 16:38:00.889706 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15db249-be96-4ba6-bac9-ea7c2d68d197" path="/var/lib/kubelet/pods/f15db249-be96-4ba6-bac9-ea7c2d68d197/volumes" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.038409 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c5cccfd57-6tll6" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.059565 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c5cccfd57-6tll6"] Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.063974 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c5cccfd57-6tll6"] Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.072142 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64998b47f4-52szx"] Feb 20 16:38:01 crc kubenswrapper[4697]: E0220 16:38:01.073544 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8" containerName="controller-manager" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.073772 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8" containerName="controller-manager" Feb 20 16:38:01 crc kubenswrapper[4697]: E0220 16:38:01.074486 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15db249-be96-4ba6-bac9-ea7c2d68d197" containerName="route-controller-manager" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.074508 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15db249-be96-4ba6-bac9-ea7c2d68d197" containerName="route-controller-manager" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.074618 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f15db249-be96-4ba6-bac9-ea7c2d68d197" containerName="route-controller-manager" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.074629 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8" containerName="controller-manager" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.075028 4697 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.077204 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf"] Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.077460 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.077584 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.077605 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.077634 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.077886 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.078532 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.081947 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.085694 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.085809 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.086736 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.086798 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.087252 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.087796 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.092330 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf"] Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.094245 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.095834 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-64998b47f4-52szx"] Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.106860 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0e2b1c36-47d0-47c8-9394-40f8e9e90005-proxy-ca-bundles\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.106943 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44z9v\" (UniqueName: \"kubernetes.io/projected/0e2b1c36-47d0-47c8-9394-40f8e9e90005-kube-api-access-44z9v\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.106988 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e2b1c36-47d0-47c8-9394-40f8e9e90005-client-ca\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.107012 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e2b1c36-47d0-47c8-9394-40f8e9e90005-config\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.107031 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e2b1c36-47d0-47c8-9394-40f8e9e90005-serving-cert\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.185154 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.185253 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.185329 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.186263 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f85fb4955a3d3f0a5802d8ba1386616b2a43a2300dcd5a99ec3f0c4b0ac3114b"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.186379 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" 
containerID="cri-o://f85fb4955a3d3f0a5802d8ba1386616b2a43a2300dcd5a99ec3f0c4b0ac3114b" gracePeriod=600 Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.208223 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44z9v\" (UniqueName: \"kubernetes.io/projected/0e2b1c36-47d0-47c8-9394-40f8e9e90005-kube-api-access-44z9v\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.208274 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62810f99-353d-488f-a4db-d7fee1a1ad7a-client-ca\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.208324 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e2b1c36-47d0-47c8-9394-40f8e9e90005-client-ca\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.208352 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xs4x\" (UniqueName: \"kubernetes.io/projected/62810f99-353d-488f-a4db-d7fee1a1ad7a-kube-api-access-2xs4x\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.208381 4697 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e2b1c36-47d0-47c8-9394-40f8e9e90005-config\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.208399 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62810f99-353d-488f-a4db-d7fee1a1ad7a-config\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.208527 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e2b1c36-47d0-47c8-9394-40f8e9e90005-serving-cert\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.208593 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0e2b1c36-47d0-47c8-9394-40f8e9e90005-proxy-ca-bundles\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.208631 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62810f99-353d-488f-a4db-d7fee1a1ad7a-serving-cert\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " 
pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.210043 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0e2b1c36-47d0-47c8-9394-40f8e9e90005-proxy-ca-bundles\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.210059 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e2b1c36-47d0-47c8-9394-40f8e9e90005-config\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.210328 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e2b1c36-47d0-47c8-9394-40f8e9e90005-client-ca\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.215008 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e2b1c36-47d0-47c8-9394-40f8e9e90005-serving-cert\") pod \"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.226344 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44z9v\" (UniqueName: \"kubernetes.io/projected/0e2b1c36-47d0-47c8-9394-40f8e9e90005-kube-api-access-44z9v\") pod 
\"controller-manager-64998b47f4-52szx\" (UID: \"0e2b1c36-47d0-47c8-9394-40f8e9e90005\") " pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.309413 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62810f99-353d-488f-a4db-d7fee1a1ad7a-serving-cert\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.309529 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62810f99-353d-488f-a4db-d7fee1a1ad7a-client-ca\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.309594 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xs4x\" (UniqueName: \"kubernetes.io/projected/62810f99-353d-488f-a4db-d7fee1a1ad7a-kube-api-access-2xs4x\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.310706 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62810f99-353d-488f-a4db-d7fee1a1ad7a-config\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.311766 4697 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62810f99-353d-488f-a4db-d7fee1a1ad7a-client-ca\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.312228 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62810f99-353d-488f-a4db-d7fee1a1ad7a-config\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.316995 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62810f99-353d-488f-a4db-d7fee1a1ad7a-serving-cert\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.338026 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xs4x\" (UniqueName: \"kubernetes.io/projected/62810f99-353d-488f-a4db-d7fee1a1ad7a-kube-api-access-2xs4x\") pod \"route-controller-manager-8657f6c6c4-fxqkf\" (UID: \"62810f99-353d-488f-a4db-d7fee1a1ad7a\") " pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.408803 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.427016 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.631174 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf"] Feb 20 16:38:01 crc kubenswrapper[4697]: I0220 16:38:01.686895 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64998b47f4-52szx"] Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.044361 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" event={"ID":"0e2b1c36-47d0-47c8-9394-40f8e9e90005","Type":"ContainerStarted","Data":"c9e598da9933a913821dccbff84055b8be6adf4a86af0008b85434edee64879a"} Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.044739 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.044753 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" event={"ID":"0e2b1c36-47d0-47c8-9394-40f8e9e90005","Type":"ContainerStarted","Data":"9acb31769da4ec01ec5748e9729fce4cb543ae57ac7282256e57e2943aace541"} Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.048354 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="f85fb4955a3d3f0a5802d8ba1386616b2a43a2300dcd5a99ec3f0c4b0ac3114b" exitCode=0 Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.048405 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"f85fb4955a3d3f0a5802d8ba1386616b2a43a2300dcd5a99ec3f0c4b0ac3114b"} Feb 20 16:38:02 crc 
kubenswrapper[4697]: I0220 16:38:02.048424 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"37b12a08456eb67b5389b54c7c7590b240888403dd45128a2dd330327f7ef7cb"} Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.048457 4697 scope.go:117] "RemoveContainer" containerID="78fe3c2dc6bbd50ae14d4fa9874df1f7a37dc96dd207abd42ffa3a96c2efc0e6" Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.050286 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" event={"ID":"62810f99-353d-488f-a4db-d7fee1a1ad7a","Type":"ContainerStarted","Data":"dfb2fb8e5cf0562decd83a01ead30474cfba5337a8e05cf1b31c3e2975e8729d"} Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.050316 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" event={"ID":"62810f99-353d-488f-a4db-d7fee1a1ad7a","Type":"ContainerStarted","Data":"c034f4f3d7c9225bd29be3bfa585be236ce18be397d0a028c089a6b616787dcf"} Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.050456 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.061929 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.073560 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64998b47f4-52szx" podStartSLOduration=3.073542109 podStartE2EDuration="3.073542109s" podCreationTimestamp="2026-02-20 16:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:38:02.072953893 +0000 UTC m=+389.852999301" watchObservedRunningTime="2026-02-20 16:38:02.073542109 +0000 UTC m=+389.853587517" Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.111452 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" podStartSLOduration=3.11142436 podStartE2EDuration="3.11142436s" podCreationTimestamp="2026-02-20 16:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:38:02.10616168 +0000 UTC m=+389.886207088" watchObservedRunningTime="2026-02-20 16:38:02.11142436 +0000 UTC m=+389.891469768" Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.345257 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8657f6c6c4-fxqkf" Feb 20 16:38:02 crc kubenswrapper[4697]: I0220 16:38:02.884374 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8" path="/var/lib/kubelet/pods/2fc7cbfe-84fd-47f4-ae70-9c7472ab04f8/volumes" Feb 20 16:38:03 crc kubenswrapper[4697]: I0220 16:38:03.917924 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-577l6" Feb 20 16:38:04 crc kubenswrapper[4697]: I0220 16:38:04.006520 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s4bgz"] Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.751607 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjcxf"] Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.752356 4697 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-qjcxf" podUID="7e335707-03f3-4791-85c7-95134150dc71" containerName="registry-server" containerID="cri-o://ed1f384c339be882e0e0c49f6fd07edbb92250c96c82721663b91857f0db2dba" gracePeriod=30 Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.763960 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7b742"] Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.770748 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7b742" podUID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" containerName="registry-server" containerID="cri-o://3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f" gracePeriod=30 Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.783889 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m6zk6"] Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.784156 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" podUID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerName="marketplace-operator" containerID="cri-o://bac1d2fa294d54ed5e6cd079d5877b9120b43a67a747ad83a6871381cf22b30e" gracePeriod=30 Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.796577 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5bghw"] Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.796897 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5bghw" podUID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" containerName="registry-server" containerID="cri-o://542d55895b6d3b6963278b0d53fec392be67737fe5e6919aa5b14cb7061cfaa2" gracePeriod=30 Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.802240 4697 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-4zh8s"] Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.802532 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4zh8s" podUID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerName="registry-server" containerID="cri-o://150322f414cdd99b3cfea31ef5d9ab39ee374a47f47736ff3bca914459f5a3e0" gracePeriod=30 Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.819062 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-shppd"] Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.820010 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.822731 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-shppd"] Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.977241 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jbsm\" (UniqueName: \"kubernetes.io/projected/84dd2186-876f-440f-8187-51f7bdda1bb8-kube-api-access-2jbsm\") pod \"marketplace-operator-79b997595-shppd\" (UID: \"84dd2186-876f-440f-8187-51f7bdda1bb8\") " pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.977288 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84dd2186-876f-440f-8187-51f7bdda1bb8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-shppd\" (UID: \"84dd2186-876f-440f-8187-51f7bdda1bb8\") " pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:12 crc kubenswrapper[4697]: I0220 16:38:12.977348 4697 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84dd2186-876f-440f-8187-51f7bdda1bb8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-shppd\" (UID: \"84dd2186-876f-440f-8187-51f7bdda1bb8\") " pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.078157 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jbsm\" (UniqueName: \"kubernetes.io/projected/84dd2186-876f-440f-8187-51f7bdda1bb8-kube-api-access-2jbsm\") pod \"marketplace-operator-79b997595-shppd\" (UID: \"84dd2186-876f-440f-8187-51f7bdda1bb8\") " pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.078199 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84dd2186-876f-440f-8187-51f7bdda1bb8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-shppd\" (UID: \"84dd2186-876f-440f-8187-51f7bdda1bb8\") " pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.078223 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84dd2186-876f-440f-8187-51f7bdda1bb8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-shppd\" (UID: \"84dd2186-876f-440f-8187-51f7bdda1bb8\") " pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.079345 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84dd2186-876f-440f-8187-51f7bdda1bb8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-shppd\" 
(UID: \"84dd2186-876f-440f-8187-51f7bdda1bb8\") " pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.097395 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84dd2186-876f-440f-8187-51f7bdda1bb8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-shppd\" (UID: \"84dd2186-876f-440f-8187-51f7bdda1bb8\") " pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.105077 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jbsm\" (UniqueName: \"kubernetes.io/projected/84dd2186-876f-440f-8187-51f7bdda1bb8-kube-api-access-2jbsm\") pod \"marketplace-operator-79b997595-shppd\" (UID: \"84dd2186-876f-440f-8187-51f7bdda1bb8\") " pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.124341 4697 generic.go:334] "Generic (PLEG): container finished" podID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" containerID="542d55895b6d3b6963278b0d53fec392be67737fe5e6919aa5b14cb7061cfaa2" exitCode=0 Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.124444 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bghw" event={"ID":"aa8bd21f-2da9-4ece-917f-091ada68a2bd","Type":"ContainerDied","Data":"542d55895b6d3b6963278b0d53fec392be67737fe5e6919aa5b14cb7061cfaa2"} Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.128019 4697 generic.go:334] "Generic (PLEG): container finished" podID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerID="150322f414cdd99b3cfea31ef5d9ab39ee374a47f47736ff3bca914459f5a3e0" exitCode=0 Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.128077 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zh8s" 
event={"ID":"a54aa135-a749-4f61-8bc7-bfea1dbd70dd","Type":"ContainerDied","Data":"150322f414cdd99b3cfea31ef5d9ab39ee374a47f47736ff3bca914459f5a3e0"} Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.130129 4697 generic.go:334] "Generic (PLEG): container finished" podID="7e335707-03f3-4791-85c7-95134150dc71" containerID="ed1f384c339be882e0e0c49f6fd07edbb92250c96c82721663b91857f0db2dba" exitCode=0 Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.130157 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcxf" event={"ID":"7e335707-03f3-4791-85c7-95134150dc71","Type":"ContainerDied","Data":"ed1f384c339be882e0e0c49f6fd07edbb92250c96c82721663b91857f0db2dba"} Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.131937 4697 generic.go:334] "Generic (PLEG): container finished" podID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerID="bac1d2fa294d54ed5e6cd079d5877b9120b43a67a747ad83a6871381cf22b30e" exitCode=0 Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.131982 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" event={"ID":"adf825e0-c430-439f-9d7c-55b7582f1b54","Type":"ContainerDied","Data":"bac1d2fa294d54ed5e6cd079d5877b9120b43a67a747ad83a6871381cf22b30e"} Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.132015 4697 scope.go:117] "RemoveContainer" containerID="07441c3fbdabb91968fc2535f0a204ecf1c420b6c747e2e87ae34f0f2a0bcdcc" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.287558 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.377380 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.434359 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.474459 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.483207 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-utilities\") pod \"7e335707-03f3-4791-85c7-95134150dc71\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.483340 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pfjk\" (UniqueName: \"kubernetes.io/projected/7e335707-03f3-4791-85c7-95134150dc71-kube-api-access-9pfjk\") pod \"7e335707-03f3-4791-85c7-95134150dc71\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.483362 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-catalog-content\") pod \"7e335707-03f3-4791-85c7-95134150dc71\" (UID: \"7e335707-03f3-4791-85c7-95134150dc71\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.485073 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-utilities" (OuterVolumeSpecName: "utilities") pod "7e335707-03f3-4791-85c7-95134150dc71" (UID: "7e335707-03f3-4791-85c7-95134150dc71"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.492778 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.496243 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e335707-03f3-4791-85c7-95134150dc71-kube-api-access-9pfjk" (OuterVolumeSpecName: "kube-api-access-9pfjk") pod "7e335707-03f3-4791-85c7-95134150dc71" (UID: "7e335707-03f3-4791-85c7-95134150dc71"). InnerVolumeSpecName "kube-api-access-9pfjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.559150 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7b742" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.572395 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e335707-03f3-4791-85c7-95134150dc71" (UID: "7e335707-03f3-4791-85c7-95134150dc71"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.584949 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-catalog-content\") pod \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\" (UID: \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.585007 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-operator-metrics\") pod \"adf825e0-c430-439f-9d7c-55b7582f1b54\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.585066 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-utilities\") pod \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\" (UID: \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.585096 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-trusted-ca\") pod \"adf825e0-c430-439f-9d7c-55b7582f1b54\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.585123 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmc2f\" (UniqueName: \"kubernetes.io/projected/adf825e0-c430-439f-9d7c-55b7582f1b54-kube-api-access-gmc2f\") pod \"adf825e0-c430-439f-9d7c-55b7582f1b54\" (UID: \"adf825e0-c430-439f-9d7c-55b7582f1b54\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.585715 4697 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "adf825e0-c430-439f-9d7c-55b7582f1b54" (UID: "adf825e0-c430-439f-9d7c-55b7582f1b54"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.586214 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frqg5\" (UniqueName: \"kubernetes.io/projected/aa8bd21f-2da9-4ece-917f-091ada68a2bd-kube-api-access-frqg5\") pod \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\" (UID: \"aa8bd21f-2da9-4ece-917f-091ada68a2bd\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.586247 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-utilities\") pod \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.586271 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n4k7\" (UniqueName: \"kubernetes.io/projected/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-kube-api-access-4n4k7\") pod \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.586303 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-utilities\") pod \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\" (UID: \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.586323 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxx9k\" (UniqueName: 
\"kubernetes.io/projected/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-kube-api-access-mxx9k\") pod \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\" (UID: \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.586472 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pfjk\" (UniqueName: \"kubernetes.io/projected/7e335707-03f3-4791-85c7-95134150dc71-kube-api-access-9pfjk\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.586488 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.586496 4697 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.586505 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e335707-03f3-4791-85c7-95134150dc71-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.586537 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-utilities" (OuterVolumeSpecName: "utilities") pod "aa8bd21f-2da9-4ece-917f-091ada68a2bd" (UID: "aa8bd21f-2da9-4ece-917f-091ada68a2bd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.587180 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-utilities" (OuterVolumeSpecName: "utilities") pod "a54aa135-a749-4f61-8bc7-bfea1dbd70dd" (UID: "a54aa135-a749-4f61-8bc7-bfea1dbd70dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.589626 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-kube-api-access-mxx9k" (OuterVolumeSpecName: "kube-api-access-mxx9k") pod "a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" (UID: "a400c2f0-eb54-4938-a0b1-fa91e1d18ff3"). InnerVolumeSpecName "kube-api-access-mxx9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.590163 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "adf825e0-c430-439f-9d7c-55b7582f1b54" (UID: "adf825e0-c430-439f-9d7c-55b7582f1b54"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.590611 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-kube-api-access-4n4k7" (OuterVolumeSpecName: "kube-api-access-4n4k7") pod "a54aa135-a749-4f61-8bc7-bfea1dbd70dd" (UID: "a54aa135-a749-4f61-8bc7-bfea1dbd70dd"). InnerVolumeSpecName "kube-api-access-4n4k7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.590776 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa8bd21f-2da9-4ece-917f-091ada68a2bd-kube-api-access-frqg5" (OuterVolumeSpecName: "kube-api-access-frqg5") pod "aa8bd21f-2da9-4ece-917f-091ada68a2bd" (UID: "aa8bd21f-2da9-4ece-917f-091ada68a2bd"). InnerVolumeSpecName "kube-api-access-frqg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.590941 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf825e0-c430-439f-9d7c-55b7582f1b54-kube-api-access-gmc2f" (OuterVolumeSpecName: "kube-api-access-gmc2f") pod "adf825e0-c430-439f-9d7c-55b7582f1b54" (UID: "adf825e0-c430-439f-9d7c-55b7582f1b54"). InnerVolumeSpecName "kube-api-access-gmc2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.594320 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-utilities" (OuterVolumeSpecName: "utilities") pod "a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" (UID: "a400c2f0-eb54-4938-a0b1-fa91e1d18ff3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.617353 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa8bd21f-2da9-4ece-917f-091ada68a2bd" (UID: "aa8bd21f-2da9-4ece-917f-091ada68a2bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.687358 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-catalog-content\") pod \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\" (UID: \"a54aa135-a749-4f61-8bc7-bfea1dbd70dd\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.687698 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-catalog-content\") pod \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\" (UID: \"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3\") " Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.688477 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.688518 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxx9k\" (UniqueName: \"kubernetes.io/projected/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-kube-api-access-mxx9k\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.688530 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.688539 4697 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/adf825e0-c430-439f-9d7c-55b7582f1b54-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.688548 4697 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa8bd21f-2da9-4ece-917f-091ada68a2bd-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.688556 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmc2f\" (UniqueName: \"kubernetes.io/projected/adf825e0-c430-439f-9d7c-55b7582f1b54-kube-api-access-gmc2f\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.688565 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frqg5\" (UniqueName: \"kubernetes.io/projected/aa8bd21f-2da9-4ece-917f-091ada68a2bd-kube-api-access-frqg5\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.689127 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.689141 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n4k7\" (UniqueName: \"kubernetes.io/projected/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-kube-api-access-4n4k7\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.740632 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" (UID: "a400c2f0-eb54-4938-a0b1-fa91e1d18ff3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.790952 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.804500 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a54aa135-a749-4f61-8bc7-bfea1dbd70dd" (UID: "a54aa135-a749-4f61-8bc7-bfea1dbd70dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.831292 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-shppd"] Feb 20 16:38:13 crc kubenswrapper[4697]: I0220 16:38:13.892199 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54aa135-a749-4f61-8bc7-bfea1dbd70dd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.138730 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.139290 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m6zk6" event={"ID":"adf825e0-c430-439f-9d7c-55b7582f1b54","Type":"ContainerDied","Data":"7cb57bc510ddd5d77bf06f19139ca8d9fd707ad6b0dbb96dfe7d73814ff63069"} Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.139339 4697 scope.go:117] "RemoveContainer" containerID="bac1d2fa294d54ed5e6cd079d5877b9120b43a67a747ad83a6871381cf22b30e" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.141385 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcxf" event={"ID":"7e335707-03f3-4791-85c7-95134150dc71","Type":"ContainerDied","Data":"871317ba189169fdcb506bab56dc9f3d8ddc06baf081481d55d03553b2c1f592"} Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.141470 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjcxf" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.142701 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-shppd" event={"ID":"84dd2186-876f-440f-8187-51f7bdda1bb8","Type":"ContainerStarted","Data":"a0a4d67a08041445dc395337d66c84134fb696b66e5bcd3fbe11067e728aab63"} Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.142751 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-shppd" event={"ID":"84dd2186-876f-440f-8187-51f7bdda1bb8","Type":"ContainerStarted","Data":"07b81696cf3b030603db86d899652cde4e61d4f32d999334d7f8486f872941df"} Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.144458 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.144746 4697 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-shppd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" start-of-body= Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.144778 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-shppd" podUID="84dd2186-876f-440f-8187-51f7bdda1bb8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.149703 4697 generic.go:334] "Generic (PLEG): container finished" podID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" containerID="3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f" exitCode=0 Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 
16:38:14.149820 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b742" event={"ID":"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3","Type":"ContainerDied","Data":"3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f"} Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.149867 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7b742" event={"ID":"a400c2f0-eb54-4938-a0b1-fa91e1d18ff3","Type":"ContainerDied","Data":"7d1965fdec4457cdf648de5e19bba67342e81a407c722965f4e57738aca165d2"} Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.150011 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7b742" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.158452 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bghw" event={"ID":"aa8bd21f-2da9-4ece-917f-091ada68a2bd","Type":"ContainerDied","Data":"8e0fc9d32d791f0ea564662c95ff892b7b7c9c1badbf1254ad2f3187c8ca4119"} Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.158553 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5bghw" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.165212 4697 scope.go:117] "RemoveContainer" containerID="ed1f384c339be882e0e0c49f6fd07edbb92250c96c82721663b91857f0db2dba" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.169222 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-shppd" podStartSLOduration=2.169205092 podStartE2EDuration="2.169205092s" podCreationTimestamp="2026-02-20 16:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:38:14.166098903 +0000 UTC m=+401.946144311" watchObservedRunningTime="2026-02-20 16:38:14.169205092 +0000 UTC m=+401.949250500" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.172222 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4zh8s" event={"ID":"a54aa135-a749-4f61-8bc7-bfea1dbd70dd","Type":"ContainerDied","Data":"403749bc2c9e7f1a7790937e5e7a1f059a6078bb14ce9b87790d249827906471"} Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.172278 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4zh8s" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.206964 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m6zk6"] Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.210698 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m6zk6"] Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.213097 4697 scope.go:117] "RemoveContainer" containerID="f30e40dfd9ef2c444770096c127233d04b5380fde7e5fe1e5110fe8a8cc9fa41" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.238313 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7b742"] Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.245579 4697 scope.go:117] "RemoveContainer" containerID="71660c9cd6894146428f2b1cc14feb5449a75ea175f32787fed93797acf489a1" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.249534 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7b742"] Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.253274 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5bghw"] Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.259748 4697 scope.go:117] "RemoveContainer" containerID="3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.259895 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5bghw"] Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.263556 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjcxf"] Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.266188 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-qjcxf"] Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.271493 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4zh8s"] Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.272130 4697 scope.go:117] "RemoveContainer" containerID="ff57adb478e0c2995e1187103d8b359f8a495d1dcef8b25c6d2d007255624c44" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.275270 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4zh8s"] Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.286620 4697 scope.go:117] "RemoveContainer" containerID="16942f6820ce80eb288c38fcdca5fb819586b98462d96099abe6d6ec18b99e6f" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.298656 4697 scope.go:117] "RemoveContainer" containerID="3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.298993 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f\": container with ID starting with 3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f not found: ID does not exist" containerID="3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.299018 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f"} err="failed to get container status \"3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f\": rpc error: code = NotFound desc = could not find container \"3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f\": container with ID starting with 3ecabe81178d22934d468c9109e1f94379b153984a6bb26df5e144bac57c515f not found: ID does not exist" Feb 20 
16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.299035 4697 scope.go:117] "RemoveContainer" containerID="ff57adb478e0c2995e1187103d8b359f8a495d1dcef8b25c6d2d007255624c44" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.300193 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff57adb478e0c2995e1187103d8b359f8a495d1dcef8b25c6d2d007255624c44\": container with ID starting with ff57adb478e0c2995e1187103d8b359f8a495d1dcef8b25c6d2d007255624c44 not found: ID does not exist" containerID="ff57adb478e0c2995e1187103d8b359f8a495d1dcef8b25c6d2d007255624c44" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.300213 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff57adb478e0c2995e1187103d8b359f8a495d1dcef8b25c6d2d007255624c44"} err="failed to get container status \"ff57adb478e0c2995e1187103d8b359f8a495d1dcef8b25c6d2d007255624c44\": rpc error: code = NotFound desc = could not find container \"ff57adb478e0c2995e1187103d8b359f8a495d1dcef8b25c6d2d007255624c44\": container with ID starting with ff57adb478e0c2995e1187103d8b359f8a495d1dcef8b25c6d2d007255624c44 not found: ID does not exist" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.300228 4697 scope.go:117] "RemoveContainer" containerID="16942f6820ce80eb288c38fcdca5fb819586b98462d96099abe6d6ec18b99e6f" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.300664 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16942f6820ce80eb288c38fcdca5fb819586b98462d96099abe6d6ec18b99e6f\": container with ID starting with 16942f6820ce80eb288c38fcdca5fb819586b98462d96099abe6d6ec18b99e6f not found: ID does not exist" containerID="16942f6820ce80eb288c38fcdca5fb819586b98462d96099abe6d6ec18b99e6f" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.300701 4697 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"16942f6820ce80eb288c38fcdca5fb819586b98462d96099abe6d6ec18b99e6f"} err="failed to get container status \"16942f6820ce80eb288c38fcdca5fb819586b98462d96099abe6d6ec18b99e6f\": rpc error: code = NotFound desc = could not find container \"16942f6820ce80eb288c38fcdca5fb819586b98462d96099abe6d6ec18b99e6f\": container with ID starting with 16942f6820ce80eb288c38fcdca5fb819586b98462d96099abe6d6ec18b99e6f not found: ID does not exist" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.300728 4697 scope.go:117] "RemoveContainer" containerID="542d55895b6d3b6963278b0d53fec392be67737fe5e6919aa5b14cb7061cfaa2" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.314893 4697 scope.go:117] "RemoveContainer" containerID="624197d830ca5b583633b86caaa0b2d582a10afd1fc894758a6fb35e30dd1e12" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.326538 4697 scope.go:117] "RemoveContainer" containerID="8cd83e91dfafd1e82abb084af428146d7885d381d0e7da57e91488fde863d1f2" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.340554 4697 scope.go:117] "RemoveContainer" containerID="150322f414cdd99b3cfea31ef5d9ab39ee374a47f47736ff3bca914459f5a3e0" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.355568 4697 scope.go:117] "RemoveContainer" containerID="2bf8657f5a02542c6f29ec1668ac422d17969f3d108ac1821a41526040959380" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.380342 4697 scope.go:117] "RemoveContainer" containerID="7467c838bd30986fafdc9bca0f64e3cfbe67ba7f0eddf735e9fe77439e99277a" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.883379 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e335707-03f3-4791-85c7-95134150dc71" path="/var/lib/kubelet/pods/7e335707-03f3-4791-85c7-95134150dc71/volumes" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.884046 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" 
path="/var/lib/kubelet/pods/a400c2f0-eb54-4938-a0b1-fa91e1d18ff3/volumes" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.884627 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" path="/var/lib/kubelet/pods/a54aa135-a749-4f61-8bc7-bfea1dbd70dd/volumes" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.885637 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" path="/var/lib/kubelet/pods/aa8bd21f-2da9-4ece-917f-091ada68a2bd/volumes" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.886240 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf825e0-c430-439f-9d7c-55b7582f1b54" path="/var/lib/kubelet/pods/adf825e0-c430-439f-9d7c-55b7582f1b54/volumes" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.976849 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qbll4"] Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.977598 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e335707-03f3-4791-85c7-95134150dc71" containerName="extract-utilities" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.977813 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e335707-03f3-4791-85c7-95134150dc71" containerName="extract-utilities" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.978007 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerName="marketplace-operator" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.978183 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerName="marketplace-operator" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.978420 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e335707-03f3-4791-85c7-95134150dc71" 
containerName="extract-content" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.978676 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e335707-03f3-4791-85c7-95134150dc71" containerName="extract-content" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.978906 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.979189 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.979413 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerName="marketplace-operator" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.979641 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerName="marketplace-operator" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.979848 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" containerName="extract-content" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.980040 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" containerName="extract-content" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.980245 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.980465 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.980655 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" 
containerName="extract-content" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.980830 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" containerName="extract-content" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.981015 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e335707-03f3-4791-85c7-95134150dc71" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.981209 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e335707-03f3-4791-85c7-95134150dc71" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.981417 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" containerName="extract-utilities" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.981640 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" containerName="extract-utilities" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.981848 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerName="extract-content" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.982021 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerName="extract-content" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.982221 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.982410 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.982661 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" 
containerName="extract-utilities" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.982882 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" containerName="extract-utilities" Feb 20 16:38:14 crc kubenswrapper[4697]: E0220 16:38:14.983069 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerName="extract-utilities" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.983256 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerName="extract-utilities" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.983697 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerName="marketplace-operator" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.983925 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="a400c2f0-eb54-4938-a0b1-fa91e1d18ff3" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.984118 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e335707-03f3-4791-85c7-95134150dc71" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.984317 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa8bd21f-2da9-4ece-917f-091ada68a2bd" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.984541 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54aa135-a749-4f61-8bc7-bfea1dbd70dd" containerName="registry-server" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.985221 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf825e0-c430-439f-9d7c-55b7582f1b54" containerName="marketplace-operator" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.986743 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.987265 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qbll4"] Feb 20 16:38:14 crc kubenswrapper[4697]: I0220 16:38:14.989767 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.107137 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c701576-ceb0-4bd0-9583-c7025ea0d061-catalog-content\") pod \"certified-operators-qbll4\" (UID: \"5c701576-ceb0-4bd0-9583-c7025ea0d061\") " pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.107515 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c701576-ceb0-4bd0-9583-c7025ea0d061-utilities\") pod \"certified-operators-qbll4\" (UID: \"5c701576-ceb0-4bd0-9583-c7025ea0d061\") " pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.107906 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdkdn\" (UniqueName: \"kubernetes.io/projected/5c701576-ceb0-4bd0-9583-c7025ea0d061-kube-api-access-hdkdn\") pod \"certified-operators-qbll4\" (UID: \"5c701576-ceb0-4bd0-9583-c7025ea0d061\") " pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.167653 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-czzb9"] Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.170854 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.174354 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.182810 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-czzb9"] Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.198107 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-shppd" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.209396 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c701576-ceb0-4bd0-9583-c7025ea0d061-utilities\") pod \"certified-operators-qbll4\" (UID: \"5c701576-ceb0-4bd0-9583-c7025ea0d061\") " pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.209462 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582272c0-0a61-44ac-886c-de82d766c32b-catalog-content\") pod \"community-operators-czzb9\" (UID: \"582272c0-0a61-44ac-886c-de82d766c32b\") " pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.209541 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582272c0-0a61-44ac-886c-de82d766c32b-utilities\") pod \"community-operators-czzb9\" (UID: \"582272c0-0a61-44ac-886c-de82d766c32b\") " pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.209568 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hdkdn\" (UniqueName: \"kubernetes.io/projected/5c701576-ceb0-4bd0-9583-c7025ea0d061-kube-api-access-hdkdn\") pod \"certified-operators-qbll4\" (UID: \"5c701576-ceb0-4bd0-9583-c7025ea0d061\") " pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.209641 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c701576-ceb0-4bd0-9583-c7025ea0d061-catalog-content\") pod \"certified-operators-qbll4\" (UID: \"5c701576-ceb0-4bd0-9583-c7025ea0d061\") " pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.209668 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4469\" (UniqueName: \"kubernetes.io/projected/582272c0-0a61-44ac-886c-de82d766c32b-kube-api-access-q4469\") pod \"community-operators-czzb9\" (UID: \"582272c0-0a61-44ac-886c-de82d766c32b\") " pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.210283 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c701576-ceb0-4bd0-9583-c7025ea0d061-utilities\") pod \"certified-operators-qbll4\" (UID: \"5c701576-ceb0-4bd0-9583-c7025ea0d061\") " pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.227091 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c701576-ceb0-4bd0-9583-c7025ea0d061-catalog-content\") pod \"certified-operators-qbll4\" (UID: \"5c701576-ceb0-4bd0-9583-c7025ea0d061\") " pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.234616 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hdkdn\" (UniqueName: \"kubernetes.io/projected/5c701576-ceb0-4bd0-9583-c7025ea0d061-kube-api-access-hdkdn\") pod \"certified-operators-qbll4\" (UID: \"5c701576-ceb0-4bd0-9583-c7025ea0d061\") " pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.308563 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.311890 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4469\" (UniqueName: \"kubernetes.io/projected/582272c0-0a61-44ac-886c-de82d766c32b-kube-api-access-q4469\") pod \"community-operators-czzb9\" (UID: \"582272c0-0a61-44ac-886c-de82d766c32b\") " pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.311937 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582272c0-0a61-44ac-886c-de82d766c32b-catalog-content\") pod \"community-operators-czzb9\" (UID: \"582272c0-0a61-44ac-886c-de82d766c32b\") " pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.311969 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582272c0-0a61-44ac-886c-de82d766c32b-utilities\") pod \"community-operators-czzb9\" (UID: \"582272c0-0a61-44ac-886c-de82d766c32b\") " pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.312511 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582272c0-0a61-44ac-886c-de82d766c32b-utilities\") pod \"community-operators-czzb9\" (UID: \"582272c0-0a61-44ac-886c-de82d766c32b\") " 
pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.313355 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582272c0-0a61-44ac-886c-de82d766c32b-catalog-content\") pod \"community-operators-czzb9\" (UID: \"582272c0-0a61-44ac-886c-de82d766c32b\") " pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.328850 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4469\" (UniqueName: \"kubernetes.io/projected/582272c0-0a61-44ac-886c-de82d766c32b-kube-api-access-q4469\") pod \"community-operators-czzb9\" (UID: \"582272c0-0a61-44ac-886c-de82d766c32b\") " pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.493581 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.696313 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qbll4"] Feb 20 16:38:15 crc kubenswrapper[4697]: W0220 16:38:15.705073 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c701576_ceb0_4bd0_9583_c7025ea0d061.slice/crio-0e73f540a493d54d5b384531c3fe1638d23a60f2318a834a1cc306b3d6213512 WatchSource:0}: Error finding container 0e73f540a493d54d5b384531c3fe1638d23a60f2318a834a1cc306b3d6213512: Status 404 returned error can't find the container with id 0e73f540a493d54d5b384531c3fe1638d23a60f2318a834a1cc306b3d6213512 Feb 20 16:38:15 crc kubenswrapper[4697]: I0220 16:38:15.928691 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-czzb9"] Feb 20 16:38:16 crc kubenswrapper[4697]: I0220 16:38:16.200291 4697 
generic.go:334] "Generic (PLEG): container finished" podID="582272c0-0a61-44ac-886c-de82d766c32b" containerID="a5bc8c21ec397e73b8e4e1711d49bd19f1377fb8d289751da27430e6ec0980a3" exitCode=0 Feb 20 16:38:16 crc kubenswrapper[4697]: I0220 16:38:16.200406 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czzb9" event={"ID":"582272c0-0a61-44ac-886c-de82d766c32b","Type":"ContainerDied","Data":"a5bc8c21ec397e73b8e4e1711d49bd19f1377fb8d289751da27430e6ec0980a3"} Feb 20 16:38:16 crc kubenswrapper[4697]: I0220 16:38:16.200479 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czzb9" event={"ID":"582272c0-0a61-44ac-886c-de82d766c32b","Type":"ContainerStarted","Data":"b87071132568e2ca67f0fee2b7a37298b46f43686a161cd4316b587bdd0d47e3"} Feb 20 16:38:16 crc kubenswrapper[4697]: I0220 16:38:16.202246 4697 generic.go:334] "Generic (PLEG): container finished" podID="5c701576-ceb0-4bd0-9583-c7025ea0d061" containerID="1855084f8ce49b79b719d7f696907b47aeff523ab546c4d858fa3bfa51577f10" exitCode=0 Feb 20 16:38:16 crc kubenswrapper[4697]: I0220 16:38:16.202283 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qbll4" event={"ID":"5c701576-ceb0-4bd0-9583-c7025ea0d061","Type":"ContainerDied","Data":"1855084f8ce49b79b719d7f696907b47aeff523ab546c4d858fa3bfa51577f10"} Feb 20 16:38:16 crc kubenswrapper[4697]: I0220 16:38:16.202360 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qbll4" event={"ID":"5c701576-ceb0-4bd0-9583-c7025ea0d061","Type":"ContainerStarted","Data":"0e73f540a493d54d5b384531c3fe1638d23a60f2318a834a1cc306b3d6213512"} Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.208223 4697 generic.go:334] "Generic (PLEG): container finished" podID="582272c0-0a61-44ac-886c-de82d766c32b" containerID="5323c230ce6f51ac8486c0fadb078decd4e3992627af5a70ef7e2fea33b7ad94" exitCode=0 Feb 20 
16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.208310 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czzb9" event={"ID":"582272c0-0a61-44ac-886c-de82d766c32b","Type":"ContainerDied","Data":"5323c230ce6f51ac8486c0fadb078decd4e3992627af5a70ef7e2fea33b7ad94"} Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.212175 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qbll4" event={"ID":"5c701576-ceb0-4bd0-9583-c7025ea0d061","Type":"ContainerStarted","Data":"d1fff9fa8d56f8c7e3b140c6783b9b81fe8add9df0aa8b24cc1a649cf635d6e2"} Feb 20 16:38:17 crc kubenswrapper[4697]: E0220 16:38:17.318909 4697 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c701576_ceb0_4bd0_9583_c7025ea0d061.slice/crio-conmon-d1fff9fa8d56f8c7e3b140c6783b9b81fe8add9df0aa8b24cc1a649cf635d6e2.scope\": RecentStats: unable to find data in memory cache]" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.362470 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6bg"] Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.363315 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.366613 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.369702 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6bg"] Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.544938 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55542521-9cd2-46b2-ab39-9545c3e50fea-catalog-content\") pod \"redhat-marketplace-pr6bg\" (UID: \"55542521-9cd2-46b2-ab39-9545c3e50fea\") " pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.545105 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trchh\" (UniqueName: \"kubernetes.io/projected/55542521-9cd2-46b2-ab39-9545c3e50fea-kube-api-access-trchh\") pod \"redhat-marketplace-pr6bg\" (UID: \"55542521-9cd2-46b2-ab39-9545c3e50fea\") " pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.545225 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55542521-9cd2-46b2-ab39-9545c3e50fea-utilities\") pod \"redhat-marketplace-pr6bg\" (UID: \"55542521-9cd2-46b2-ab39-9545c3e50fea\") " pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.559622 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ksnpf"] Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.560765 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.562576 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.577386 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ksnpf"] Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.646265 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55542521-9cd2-46b2-ab39-9545c3e50fea-utilities\") pod \"redhat-marketplace-pr6bg\" (UID: \"55542521-9cd2-46b2-ab39-9545c3e50fea\") " pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.646343 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55542521-9cd2-46b2-ab39-9545c3e50fea-catalog-content\") pod \"redhat-marketplace-pr6bg\" (UID: \"55542521-9cd2-46b2-ab39-9545c3e50fea\") " pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.646419 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trchh\" (UniqueName: \"kubernetes.io/projected/55542521-9cd2-46b2-ab39-9545c3e50fea-kube-api-access-trchh\") pod \"redhat-marketplace-pr6bg\" (UID: \"55542521-9cd2-46b2-ab39-9545c3e50fea\") " pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.646869 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55542521-9cd2-46b2-ab39-9545c3e50fea-utilities\") pod \"redhat-marketplace-pr6bg\" (UID: \"55542521-9cd2-46b2-ab39-9545c3e50fea\") " pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 
20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.646880 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55542521-9cd2-46b2-ab39-9545c3e50fea-catalog-content\") pod \"redhat-marketplace-pr6bg\" (UID: \"55542521-9cd2-46b2-ab39-9545c3e50fea\") " pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.671772 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trchh\" (UniqueName: \"kubernetes.io/projected/55542521-9cd2-46b2-ab39-9545c3e50fea-kube-api-access-trchh\") pod \"redhat-marketplace-pr6bg\" (UID: \"55542521-9cd2-46b2-ab39-9545c3e50fea\") " pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.683576 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.747906 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df2bff45-36b1-4240-9828-29382414ea11-utilities\") pod \"redhat-operators-ksnpf\" (UID: \"df2bff45-36b1-4240-9828-29382414ea11\") " pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.747950 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df2bff45-36b1-4240-9828-29382414ea11-catalog-content\") pod \"redhat-operators-ksnpf\" (UID: \"df2bff45-36b1-4240-9828-29382414ea11\") " pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.748005 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx6mh\" (UniqueName: 
\"kubernetes.io/projected/df2bff45-36b1-4240-9828-29382414ea11-kube-api-access-bx6mh\") pod \"redhat-operators-ksnpf\" (UID: \"df2bff45-36b1-4240-9828-29382414ea11\") " pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.849633 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx6mh\" (UniqueName: \"kubernetes.io/projected/df2bff45-36b1-4240-9828-29382414ea11-kube-api-access-bx6mh\") pod \"redhat-operators-ksnpf\" (UID: \"df2bff45-36b1-4240-9828-29382414ea11\") " pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.849948 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df2bff45-36b1-4240-9828-29382414ea11-utilities\") pod \"redhat-operators-ksnpf\" (UID: \"df2bff45-36b1-4240-9828-29382414ea11\") " pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.849988 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df2bff45-36b1-4240-9828-29382414ea11-catalog-content\") pod \"redhat-operators-ksnpf\" (UID: \"df2bff45-36b1-4240-9828-29382414ea11\") " pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.850664 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df2bff45-36b1-4240-9828-29382414ea11-catalog-content\") pod \"redhat-operators-ksnpf\" (UID: \"df2bff45-36b1-4240-9828-29382414ea11\") " pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.850794 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/df2bff45-36b1-4240-9828-29382414ea11-utilities\") pod \"redhat-operators-ksnpf\" (UID: \"df2bff45-36b1-4240-9828-29382414ea11\") " pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.868991 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx6mh\" (UniqueName: \"kubernetes.io/projected/df2bff45-36b1-4240-9828-29382414ea11-kube-api-access-bx6mh\") pod \"redhat-operators-ksnpf\" (UID: \"df2bff45-36b1-4240-9828-29382414ea11\") " pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:17 crc kubenswrapper[4697]: I0220 16:38:17.921956 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:18 crc kubenswrapper[4697]: I0220 16:38:18.075955 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr6bg"] Feb 20 16:38:18 crc kubenswrapper[4697]: W0220 16:38:18.082865 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55542521_9cd2_46b2_ab39_9545c3e50fea.slice/crio-97ddc568afbb014bc6a5c2a60930d25d315b8490c03b81f293c2c7d3a3951668 WatchSource:0}: Error finding container 97ddc568afbb014bc6a5c2a60930d25d315b8490c03b81f293c2c7d3a3951668: Status 404 returned error can't find the container with id 97ddc568afbb014bc6a5c2a60930d25d315b8490c03b81f293c2c7d3a3951668 Feb 20 16:38:18 crc kubenswrapper[4697]: I0220 16:38:18.219768 4697 generic.go:334] "Generic (PLEG): container finished" podID="5c701576-ceb0-4bd0-9583-c7025ea0d061" containerID="d1fff9fa8d56f8c7e3b140c6783b9b81fe8add9df0aa8b24cc1a649cf635d6e2" exitCode=0 Feb 20 16:38:18 crc kubenswrapper[4697]: I0220 16:38:18.219845 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qbll4" 
event={"ID":"5c701576-ceb0-4bd0-9583-c7025ea0d061","Type":"ContainerDied","Data":"d1fff9fa8d56f8c7e3b140c6783b9b81fe8add9df0aa8b24cc1a649cf635d6e2"} Feb 20 16:38:18 crc kubenswrapper[4697]: I0220 16:38:18.225715 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czzb9" event={"ID":"582272c0-0a61-44ac-886c-de82d766c32b","Type":"ContainerStarted","Data":"7e5d97d96c40749c0e12fd0132fea104497c9483f44f2fbbe243376bb7f6e057"} Feb 20 16:38:18 crc kubenswrapper[4697]: I0220 16:38:18.228100 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6bg" event={"ID":"55542521-9cd2-46b2-ab39-9545c3e50fea","Type":"ContainerStarted","Data":"260f44e143f15ff4392afb355baf32094ebdcef483f710e585972b6b4b080dd9"} Feb 20 16:38:18 crc kubenswrapper[4697]: I0220 16:38:18.228990 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6bg" event={"ID":"55542521-9cd2-46b2-ab39-9545c3e50fea","Type":"ContainerStarted","Data":"97ddc568afbb014bc6a5c2a60930d25d315b8490c03b81f293c2c7d3a3951668"} Feb 20 16:38:18 crc kubenswrapper[4697]: I0220 16:38:18.269513 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-czzb9" podStartSLOduration=1.840843427 podStartE2EDuration="3.269494865s" podCreationTimestamp="2026-02-20 16:38:15 +0000 UTC" firstStartedPulling="2026-02-20 16:38:16.20327931 +0000 UTC m=+403.983324718" lastFinishedPulling="2026-02-20 16:38:17.631930728 +0000 UTC m=+405.411976156" observedRunningTime="2026-02-20 16:38:18.266250463 +0000 UTC m=+406.046295871" watchObservedRunningTime="2026-02-20 16:38:18.269494865 +0000 UTC m=+406.049540273" Feb 20 16:38:18 crc kubenswrapper[4697]: I0220 16:38:18.319424 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ksnpf"] Feb 20 16:38:18 crc kubenswrapper[4697]: W0220 16:38:18.327094 4697 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf2bff45_36b1_4240_9828_29382414ea11.slice/crio-97debf124fa3313c87760667553d228afe75eea3827df58b67eaf81704f62575 WatchSource:0}: Error finding container 97debf124fa3313c87760667553d228afe75eea3827df58b67eaf81704f62575: Status 404 returned error can't find the container with id 97debf124fa3313c87760667553d228afe75eea3827df58b67eaf81704f62575 Feb 20 16:38:19 crc kubenswrapper[4697]: I0220 16:38:19.236337 4697 generic.go:334] "Generic (PLEG): container finished" podID="55542521-9cd2-46b2-ab39-9545c3e50fea" containerID="260f44e143f15ff4392afb355baf32094ebdcef483f710e585972b6b4b080dd9" exitCode=0 Feb 20 16:38:19 crc kubenswrapper[4697]: I0220 16:38:19.236675 4697 generic.go:334] "Generic (PLEG): container finished" podID="55542521-9cd2-46b2-ab39-9545c3e50fea" containerID="f6a50e572d136ae0b0a57d346d88aa75515744e8e85536cf00c082419700b819" exitCode=0 Feb 20 16:38:19 crc kubenswrapper[4697]: I0220 16:38:19.236484 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6bg" event={"ID":"55542521-9cd2-46b2-ab39-9545c3e50fea","Type":"ContainerDied","Data":"260f44e143f15ff4392afb355baf32094ebdcef483f710e585972b6b4b080dd9"} Feb 20 16:38:19 crc kubenswrapper[4697]: I0220 16:38:19.236759 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6bg" event={"ID":"55542521-9cd2-46b2-ab39-9545c3e50fea","Type":"ContainerDied","Data":"f6a50e572d136ae0b0a57d346d88aa75515744e8e85536cf00c082419700b819"} Feb 20 16:38:19 crc kubenswrapper[4697]: I0220 16:38:19.238991 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qbll4" event={"ID":"5c701576-ceb0-4bd0-9583-c7025ea0d061","Type":"ContainerStarted","Data":"fe53cd959e500bad09ce581c1ba67b1a8c12b709b8cc309face1c856d8ec1de2"} Feb 20 16:38:19 crc kubenswrapper[4697]: I0220 
16:38:19.240614 4697 generic.go:334] "Generic (PLEG): container finished" podID="df2bff45-36b1-4240-9828-29382414ea11" containerID="6d4c5ec7fe9ce7dbaa4a5473a4265aaca30270a17a2af0b4df2979a5bfce8531" exitCode=0 Feb 20 16:38:19 crc kubenswrapper[4697]: I0220 16:38:19.240672 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksnpf" event={"ID":"df2bff45-36b1-4240-9828-29382414ea11","Type":"ContainerDied","Data":"6d4c5ec7fe9ce7dbaa4a5473a4265aaca30270a17a2af0b4df2979a5bfce8531"} Feb 20 16:38:19 crc kubenswrapper[4697]: I0220 16:38:19.240713 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksnpf" event={"ID":"df2bff45-36b1-4240-9828-29382414ea11","Type":"ContainerStarted","Data":"97debf124fa3313c87760667553d228afe75eea3827df58b67eaf81704f62575"} Feb 20 16:38:19 crc kubenswrapper[4697]: I0220 16:38:19.309556 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qbll4" podStartSLOduration=2.899448762 podStartE2EDuration="5.309536602s" podCreationTimestamp="2026-02-20 16:38:14 +0000 UTC" firstStartedPulling="2026-02-20 16:38:16.203797108 +0000 UTC m=+403.983842516" lastFinishedPulling="2026-02-20 16:38:18.613884948 +0000 UTC m=+406.393930356" observedRunningTime="2026-02-20 16:38:19.304093396 +0000 UTC m=+407.084138804" watchObservedRunningTime="2026-02-20 16:38:19.309536602 +0000 UTC m=+407.089582010" Feb 20 16:38:20 crc kubenswrapper[4697]: I0220 16:38:20.247834 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr6bg" event={"ID":"55542521-9cd2-46b2-ab39-9545c3e50fea","Type":"ContainerStarted","Data":"df244ce9e470967a786003301712672f82054c0b4793de22ac9e6a9bebabeb9a"} Feb 20 16:38:20 crc kubenswrapper[4697]: I0220 16:38:20.249976 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksnpf" 
event={"ID":"df2bff45-36b1-4240-9828-29382414ea11","Type":"ContainerStarted","Data":"fe3dd1787c88fd19a4f403fc0a370a3ed04a19f3897db81322783cec547f89c5"} Feb 20 16:38:20 crc kubenswrapper[4697]: I0220 16:38:20.272132 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pr6bg" podStartSLOduration=1.89760545 podStartE2EDuration="3.272109154s" podCreationTimestamp="2026-02-20 16:38:17 +0000 UTC" firstStartedPulling="2026-02-20 16:38:18.229290469 +0000 UTC m=+406.009335877" lastFinishedPulling="2026-02-20 16:38:19.603794173 +0000 UTC m=+407.383839581" observedRunningTime="2026-02-20 16:38:20.268928634 +0000 UTC m=+408.048974042" watchObservedRunningTime="2026-02-20 16:38:20.272109154 +0000 UTC m=+408.052154572" Feb 20 16:38:21 crc kubenswrapper[4697]: I0220 16:38:21.257742 4697 generic.go:334] "Generic (PLEG): container finished" podID="df2bff45-36b1-4240-9828-29382414ea11" containerID="fe3dd1787c88fd19a4f403fc0a370a3ed04a19f3897db81322783cec547f89c5" exitCode=0 Feb 20 16:38:21 crc kubenswrapper[4697]: I0220 16:38:21.258121 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksnpf" event={"ID":"df2bff45-36b1-4240-9828-29382414ea11","Type":"ContainerDied","Data":"fe3dd1787c88fd19a4f403fc0a370a3ed04a19f3897db81322783cec547f89c5"} Feb 20 16:38:22 crc kubenswrapper[4697]: I0220 16:38:22.264881 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksnpf" event={"ID":"df2bff45-36b1-4240-9828-29382414ea11","Type":"ContainerStarted","Data":"0db6b0b988a0a7b4148167eaad84d410698944722c08815afb99855dbba5478a"} Feb 20 16:38:25 crc kubenswrapper[4697]: I0220 16:38:25.309118 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:25 crc kubenswrapper[4697]: I0220 16:38:25.309728 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:25 crc kubenswrapper[4697]: I0220 16:38:25.354284 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:25 crc kubenswrapper[4697]: I0220 16:38:25.374538 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ksnpf" podStartSLOduration=5.99699264 podStartE2EDuration="8.374522445s" podCreationTimestamp="2026-02-20 16:38:17 +0000 UTC" firstStartedPulling="2026-02-20 16:38:19.241883083 +0000 UTC m=+407.021928501" lastFinishedPulling="2026-02-20 16:38:21.619412898 +0000 UTC m=+409.399458306" observedRunningTime="2026-02-20 16:38:22.288786807 +0000 UTC m=+410.068832215" watchObservedRunningTime="2026-02-20 16:38:25.374522445 +0000 UTC m=+413.154567853" Feb 20 16:38:25 crc kubenswrapper[4697]: I0220 16:38:25.494032 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:25 crc kubenswrapper[4697]: I0220 16:38:25.494646 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:25 crc kubenswrapper[4697]: I0220 16:38:25.535477 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:26 crc kubenswrapper[4697]: I0220 16:38:26.338365 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-czzb9" Feb 20 16:38:26 crc kubenswrapper[4697]: I0220 16:38:26.340287 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qbll4" Feb 20 16:38:27 crc kubenswrapper[4697]: I0220 16:38:27.684175 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:27 crc kubenswrapper[4697]: I0220 16:38:27.685598 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:27 crc kubenswrapper[4697]: I0220 16:38:27.722157 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:27 crc kubenswrapper[4697]: I0220 16:38:27.923017 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:27 crc kubenswrapper[4697]: I0220 16:38:27.923085 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:27 crc kubenswrapper[4697]: I0220 16:38:27.974295 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:28 crc kubenswrapper[4697]: I0220 16:38:28.338664 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pr6bg" Feb 20 16:38:28 crc kubenswrapper[4697]: I0220 16:38:28.339123 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ksnpf" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.058885 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" podUID="bf1acafc-f7fd-49ab-b574-9aed683db705" containerName="registry" containerID="cri-o://e3d5e881c62660cf3778ca93631902e664dcbe8ca873445984c3be1904ff2d82" gracePeriod=30 Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.298151 4697 generic.go:334] "Generic (PLEG): container finished" podID="bf1acafc-f7fd-49ab-b574-9aed683db705" containerID="e3d5e881c62660cf3778ca93631902e664dcbe8ca873445984c3be1904ff2d82" 
exitCode=0 Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.298461 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" event={"ID":"bf1acafc-f7fd-49ab-b574-9aed683db705","Type":"ContainerDied","Data":"e3d5e881c62660cf3778ca93631902e664dcbe8ca873445984c3be1904ff2d82"} Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.512889 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.618197 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-certificates\") pod \"bf1acafc-f7fd-49ab-b574-9aed683db705\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.618250 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf1acafc-f7fd-49ab-b574-9aed683db705-installation-pull-secrets\") pod \"bf1acafc-f7fd-49ab-b574-9aed683db705\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.618274 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-bound-sa-token\") pod \"bf1acafc-f7fd-49ab-b574-9aed683db705\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.618300 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf1acafc-f7fd-49ab-b574-9aed683db705-ca-trust-extracted\") pod \"bf1acafc-f7fd-49ab-b574-9aed683db705\" (UID: 
\"bf1acafc-f7fd-49ab-b574-9aed683db705\") " Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.618471 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"bf1acafc-f7fd-49ab-b574-9aed683db705\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.618490 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-tls\") pod \"bf1acafc-f7fd-49ab-b574-9aed683db705\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.618521 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-trusted-ca\") pod \"bf1acafc-f7fd-49ab-b574-9aed683db705\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.618543 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw685\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-kube-api-access-mw685\") pod \"bf1acafc-f7fd-49ab-b574-9aed683db705\" (UID: \"bf1acafc-f7fd-49ab-b574-9aed683db705\") " Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.619149 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bf1acafc-f7fd-49ab-b574-9aed683db705" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.619177 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf1acafc-f7fd-49ab-b574-9aed683db705" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.623973 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bf1acafc-f7fd-49ab-b574-9aed683db705" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.624702 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-kube-api-access-mw685" (OuterVolumeSpecName: "kube-api-access-mw685") pod "bf1acafc-f7fd-49ab-b574-9aed683db705" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705"). InnerVolumeSpecName "kube-api-access-mw685". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.625509 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1acafc-f7fd-49ab-b574-9aed683db705-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bf1acafc-f7fd-49ab-b574-9aed683db705" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.625679 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf1acafc-f7fd-49ab-b574-9aed683db705" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.635505 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf1acafc-f7fd-49ab-b574-9aed683db705-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bf1acafc-f7fd-49ab-b574-9aed683db705" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.637202 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "bf1acafc-f7fd-49ab-b574-9aed683db705" (UID: "bf1acafc-f7fd-49ab-b574-9aed683db705"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.720263 4697 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.720301 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw685\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-kube-api-access-mw685\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.720315 4697 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.720325 4697 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf1acafc-f7fd-49ab-b574-9aed683db705-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.720334 4697 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.720345 4697 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf1acafc-f7fd-49ab-b574-9aed683db705-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:29 crc kubenswrapper[4697]: I0220 16:38:29.720355 4697 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf1acafc-f7fd-49ab-b574-9aed683db705-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:38:30 crc 
kubenswrapper[4697]: I0220 16:38:30.305227 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" event={"ID":"bf1acafc-f7fd-49ab-b574-9aed683db705","Type":"ContainerDied","Data":"3f9696f3e08dadd4eb55fad2e6bb7f5867baca0c7de39bf4cd71633e2481e99d"} Feb 20 16:38:30 crc kubenswrapper[4697]: I0220 16:38:30.305289 4697 scope.go:117] "RemoveContainer" containerID="e3d5e881c62660cf3778ca93631902e664dcbe8ca873445984c3be1904ff2d82" Feb 20 16:38:30 crc kubenswrapper[4697]: I0220 16:38:30.305413 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s4bgz" Feb 20 16:38:30 crc kubenswrapper[4697]: I0220 16:38:30.336305 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s4bgz"] Feb 20 16:38:30 crc kubenswrapper[4697]: I0220 16:38:30.340360 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s4bgz"] Feb 20 16:38:30 crc kubenswrapper[4697]: I0220 16:38:30.884679 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1acafc-f7fd-49ab-b574-9aed683db705" path="/var/lib/kubelet/pods/bf1acafc-f7fd-49ab-b574-9aed683db705/volumes" Feb 20 16:40:01 crc kubenswrapper[4697]: I0220 16:40:01.185350 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:40:01 crc kubenswrapper[4697]: I0220 16:40:01.186119 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:40:31 crc kubenswrapper[4697]: I0220 16:40:31.185069 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:40:31 crc kubenswrapper[4697]: I0220 16:40:31.185642 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:41:01 crc kubenswrapper[4697]: I0220 16:41:01.185347 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:41:01 crc kubenswrapper[4697]: I0220 16:41:01.185877 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:41:01 crc kubenswrapper[4697]: I0220 16:41:01.185923 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:41:01 crc kubenswrapper[4697]: I0220 16:41:01.186587 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"37b12a08456eb67b5389b54c7c7590b240888403dd45128a2dd330327f7ef7cb"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 16:41:01 crc kubenswrapper[4697]: I0220 16:41:01.186652 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://37b12a08456eb67b5389b54c7c7590b240888403dd45128a2dd330327f7ef7cb" gracePeriod=600 Feb 20 16:41:02 crc kubenswrapper[4697]: I0220 16:41:02.238647 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="37b12a08456eb67b5389b54c7c7590b240888403dd45128a2dd330327f7ef7cb" exitCode=0 Feb 20 16:41:02 crc kubenswrapper[4697]: I0220 16:41:02.238777 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"37b12a08456eb67b5389b54c7c7590b240888403dd45128a2dd330327f7ef7cb"} Feb 20 16:41:02 crc kubenswrapper[4697]: I0220 16:41:02.239046 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"a39e2b324782ff79e96c097bdf5a12c5992709cf28743465f5ab68009a413113"} Feb 20 16:41:02 crc kubenswrapper[4697]: I0220 16:41:02.239073 4697 scope.go:117] "RemoveContainer" containerID="f85fb4955a3d3f0a5802d8ba1386616b2a43a2300dcd5a99ec3f0c4b0ac3114b" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.204801 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rs7j9"] Feb 20 16:42:59 crc kubenswrapper[4697]: E0220 
16:42:59.205605 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1acafc-f7fd-49ab-b574-9aed683db705" containerName="registry" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.205619 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1acafc-f7fd-49ab-b574-9aed683db705" containerName="registry" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.205747 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1acafc-f7fd-49ab-b574-9aed683db705" containerName="registry" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.206168 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rs7j9" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.208198 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.208405 4697 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wwzhd" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.208634 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.220636 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rs7j9"] Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.226703 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-x2rxq"] Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.227505 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-x2rxq" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.232876 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zd9ck"] Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.233405 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-zd9ck" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.237058 4697 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-4wcc5" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.238887 4697 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2h264" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.248874 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-x2rxq"] Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.252839 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zd9ck"] Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.299792 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8nfr\" (UniqueName: \"kubernetes.io/projected/ee3c17f3-a89c-49fa-8cf9-75e4914401cc-kube-api-access-f8nfr\") pod \"cert-manager-858654f9db-x2rxq\" (UID: \"ee3c17f3-a89c-49fa-8cf9-75e4914401cc\") " pod="cert-manager/cert-manager-858654f9db-x2rxq" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.299854 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqjpm\" (UniqueName: \"kubernetes.io/projected/2c82f3c1-cc18-41b5-9489-61f52f31a74a-kube-api-access-kqjpm\") pod \"cert-manager-cainjector-cf98fcc89-rs7j9\" (UID: \"2c82f3c1-cc18-41b5-9489-61f52f31a74a\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-rs7j9" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.299919 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnjrs\" (UniqueName: \"kubernetes.io/projected/36e190bc-14ce-4ce6-92ac-d48197515527-kube-api-access-vnjrs\") pod \"cert-manager-webhook-687f57d79b-zd9ck\" (UID: \"36e190bc-14ce-4ce6-92ac-d48197515527\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zd9ck" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.401410 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnjrs\" (UniqueName: \"kubernetes.io/projected/36e190bc-14ce-4ce6-92ac-d48197515527-kube-api-access-vnjrs\") pod \"cert-manager-webhook-687f57d79b-zd9ck\" (UID: \"36e190bc-14ce-4ce6-92ac-d48197515527\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zd9ck" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.401491 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8nfr\" (UniqueName: \"kubernetes.io/projected/ee3c17f3-a89c-49fa-8cf9-75e4914401cc-kube-api-access-f8nfr\") pod \"cert-manager-858654f9db-x2rxq\" (UID: \"ee3c17f3-a89c-49fa-8cf9-75e4914401cc\") " pod="cert-manager/cert-manager-858654f9db-x2rxq" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.401538 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqjpm\" (UniqueName: \"kubernetes.io/projected/2c82f3c1-cc18-41b5-9489-61f52f31a74a-kube-api-access-kqjpm\") pod \"cert-manager-cainjector-cf98fcc89-rs7j9\" (UID: \"2c82f3c1-cc18-41b5-9489-61f52f31a74a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-rs7j9" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.420151 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnjrs\" (UniqueName: 
\"kubernetes.io/projected/36e190bc-14ce-4ce6-92ac-d48197515527-kube-api-access-vnjrs\") pod \"cert-manager-webhook-687f57d79b-zd9ck\" (UID: \"36e190bc-14ce-4ce6-92ac-d48197515527\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zd9ck" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.421299 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8nfr\" (UniqueName: \"kubernetes.io/projected/ee3c17f3-a89c-49fa-8cf9-75e4914401cc-kube-api-access-f8nfr\") pod \"cert-manager-858654f9db-x2rxq\" (UID: \"ee3c17f3-a89c-49fa-8cf9-75e4914401cc\") " pod="cert-manager/cert-manager-858654f9db-x2rxq" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.426660 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqjpm\" (UniqueName: \"kubernetes.io/projected/2c82f3c1-cc18-41b5-9489-61f52f31a74a-kube-api-access-kqjpm\") pod \"cert-manager-cainjector-cf98fcc89-rs7j9\" (UID: \"2c82f3c1-cc18-41b5-9489-61f52f31a74a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-rs7j9" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.525915 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rs7j9" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.544107 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-x2rxq" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.552201 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-zd9ck" Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.768363 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-x2rxq"] Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.778977 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.815397 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zd9ck"] Feb 20 16:42:59 crc kubenswrapper[4697]: W0220 16:42:59.820227 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36e190bc_14ce_4ce6_92ac_d48197515527.slice/crio-5a52a1120815eff4e54aaee8c494443c4565ebb3e8351866eab94d0fba21e3fc WatchSource:0}: Error finding container 5a52a1120815eff4e54aaee8c494443c4565ebb3e8351866eab94d0fba21e3fc: Status 404 returned error can't find the container with id 5a52a1120815eff4e54aaee8c494443c4565ebb3e8351866eab94d0fba21e3fc Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.937794 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rs7j9"] Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.969689 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rs7j9" event={"ID":"2c82f3c1-cc18-41b5-9489-61f52f31a74a","Type":"ContainerStarted","Data":"0cb4e26a0e688ff91d65c653c134e736734276b5bcab01af27169eb79e0d74b4"} Feb 20 16:42:59 crc kubenswrapper[4697]: I0220 16:42:59.970656 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-x2rxq" event={"ID":"ee3c17f3-a89c-49fa-8cf9-75e4914401cc","Type":"ContainerStarted","Data":"8ee7f5e1119c31a96b7bbde5328ae43c010771f47d8b39f4e1b79fb006d6d620"} Feb 20 16:42:59 crc kubenswrapper[4697]: 
I0220 16:42:59.971362 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zd9ck" event={"ID":"36e190bc-14ce-4ce6-92ac-d48197515527","Type":"ContainerStarted","Data":"5a52a1120815eff4e54aaee8c494443c4565ebb3e8351866eab94d0fba21e3fc"} Feb 20 16:43:01 crc kubenswrapper[4697]: I0220 16:43:01.184394 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:43:01 crc kubenswrapper[4697]: I0220 16:43:01.185478 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:43:03 crc kubenswrapper[4697]: I0220 16:43:03.997932 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zd9ck" event={"ID":"36e190bc-14ce-4ce6-92ac-d48197515527","Type":"ContainerStarted","Data":"a3246a0f77b1ab755626e9618dcbce6226b942c7c7ad27a879597a8fee3b2b15"} Feb 20 16:43:04 crc kubenswrapper[4697]: I0220 16:43:03.998739 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-zd9ck" Feb 20 16:43:04 crc kubenswrapper[4697]: I0220 16:43:04.002110 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rs7j9" event={"ID":"2c82f3c1-cc18-41b5-9489-61f52f31a74a","Type":"ContainerStarted","Data":"b6062633363904cee5c4e8874a8a9ecf536ec6e75aa1ba3cee9c735381b152d3"} Feb 20 16:43:04 crc kubenswrapper[4697]: I0220 16:43:04.004015 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-858654f9db-x2rxq" event={"ID":"ee3c17f3-a89c-49fa-8cf9-75e4914401cc","Type":"ContainerStarted","Data":"ec2e635e5b93cfadaba9a702529c719f77251b1c9652f646f3ed61475dc888c9"} Feb 20 16:43:04 crc kubenswrapper[4697]: I0220 16:43:04.018874 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-zd9ck" podStartSLOduration=1.045424638 podStartE2EDuration="5.018850706s" podCreationTimestamp="2026-02-20 16:42:59 +0000 UTC" firstStartedPulling="2026-02-20 16:42:59.822078078 +0000 UTC m=+687.602123486" lastFinishedPulling="2026-02-20 16:43:03.795504146 +0000 UTC m=+691.575549554" observedRunningTime="2026-02-20 16:43:04.014423899 +0000 UTC m=+691.794469327" watchObservedRunningTime="2026-02-20 16:43:04.018850706 +0000 UTC m=+691.798896134" Feb 20 16:43:04 crc kubenswrapper[4697]: I0220 16:43:04.050717 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rs7j9" podStartSLOduration=1.2178591189999999 podStartE2EDuration="5.050682984s" podCreationTimestamp="2026-02-20 16:42:59 +0000 UTC" firstStartedPulling="2026-02-20 16:42:59.946656664 +0000 UTC m=+687.726702072" lastFinishedPulling="2026-02-20 16:43:03.779480529 +0000 UTC m=+691.559525937" observedRunningTime="2026-02-20 16:43:04.029106774 +0000 UTC m=+691.809152182" watchObservedRunningTime="2026-02-20 16:43:04.050682984 +0000 UTC m=+691.830728402" Feb 20 16:43:04 crc kubenswrapper[4697]: I0220 16:43:04.071320 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-x2rxq" podStartSLOduration=1.073491705 podStartE2EDuration="5.071293482s" podCreationTimestamp="2026-02-20 16:42:59 +0000 UTC" firstStartedPulling="2026-02-20 16:42:59.777073151 +0000 UTC m=+687.557118569" lastFinishedPulling="2026-02-20 16:43:03.774874938 +0000 UTC m=+691.554920346" observedRunningTime="2026-02-20 16:43:04.070701418 +0000 
UTC m=+691.850746836" watchObservedRunningTime="2026-02-20 16:43:04.071293482 +0000 UTC m=+691.851338900" Feb 20 16:43:09 crc kubenswrapper[4697]: I0220 16:43:09.556078 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-zd9ck" Feb 20 16:43:26 crc kubenswrapper[4697]: I0220 16:43:26.698077 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9zpdc"] Feb 20 16:43:26 crc kubenswrapper[4697]: I0220 16:43:26.699373 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovn-controller" containerID="cri-o://38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00" gracePeriod=30 Feb 20 16:43:26 crc kubenswrapper[4697]: I0220 16:43:26.700082 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="kube-rbac-proxy-node" containerID="cri-o://ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b" gracePeriod=30 Feb 20 16:43:26 crc kubenswrapper[4697]: I0220 16:43:26.700287 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovn-acl-logging" containerID="cri-o://0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725" gracePeriod=30 Feb 20 16:43:26 crc kubenswrapper[4697]: I0220 16:43:26.700260 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="sbdb" containerID="cri-o://e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff" gracePeriod=30 Feb 20 16:43:26 crc kubenswrapper[4697]: I0220 16:43:26.700143 
4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="nbdb" containerID="cri-o://12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0" gracePeriod=30 Feb 20 16:43:26 crc kubenswrapper[4697]: I0220 16:43:26.700042 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a" gracePeriod=30 Feb 20 16:43:26 crc kubenswrapper[4697]: I0220 16:43:26.699951 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="northd" containerID="cri-o://c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63" gracePeriod=30 Feb 20 16:43:26 crc kubenswrapper[4697]: E0220 16:43:26.753678 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 20 16:43:26 crc kubenswrapper[4697]: E0220 16:43:26.753725 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Feb 20 16:43:26 crc kubenswrapper[4697]: E0220 16:43:26.760909 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 20 16:43:26 crc kubenswrapper[4697]: I0220 16:43:26.763604 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" containerID="cri-o://2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f" gracePeriod=30 Feb 20 16:43:26 crc kubenswrapper[4697]: E0220 16:43:26.766815 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 20 16:43:26 crc kubenswrapper[4697]: E0220 16:43:26.766888 4697 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="sbdb" Feb 20 16:43:26 crc kubenswrapper[4697]: E0220 16:43:26.767749 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Feb 20 16:43:26 crc kubenswrapper[4697]: E0220 16:43:26.776077 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Feb 20 16:43:26 crc kubenswrapper[4697]: E0220 16:43:26.776151 4697 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="nbdb" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.034411 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/3.log" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.037130 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovn-acl-logging/0.log" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.037669 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovn-controller/0.log" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.038311 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.092659 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-26bt5"] Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.092897 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="kubecfg-setup" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.092911 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="kubecfg-setup" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.092924 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.092932 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.092942 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="northd" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.092949 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="northd" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.092959 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="nbdb" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.092965 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="nbdb" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.092974 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" 
Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.092980 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.092991 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093028 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.093040 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093047 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.093058 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovn-acl-logging" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093065 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovn-acl-logging" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.093076 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="sbdb" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093083 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="sbdb" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.093096 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 
16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093104 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.093115 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="kube-rbac-proxy-node" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093122 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="kube-rbac-proxy-node" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.093133 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovn-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093141 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovn-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093259 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="sbdb" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093269 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093276 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovn-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093287 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="northd" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093294 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovn-acl-logging" Feb 
20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093305 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="kube-rbac-proxy-node" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093316 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="nbdb" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093326 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093334 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093341 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093352 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.093470 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093480 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.093602 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" containerName="ovnkube-controller" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.095402 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.141326 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lrpxf_1de5dc4e-ef42-48fc-be23-eaec2039c031/kube-multus/2.log" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.142110 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lrpxf_1de5dc4e-ef42-48fc-be23-eaec2039c031/kube-multus/1.log" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.142173 4697 generic.go:334] "Generic (PLEG): container finished" podID="1de5dc4e-ef42-48fc-be23-eaec2039c031" containerID="3c189a1fdd8a35950990c7aaff7044115c85864154e3618a32b8b7eaf68d188d" exitCode=2 Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.142257 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lrpxf" event={"ID":"1de5dc4e-ef42-48fc-be23-eaec2039c031","Type":"ContainerDied","Data":"3c189a1fdd8a35950990c7aaff7044115c85864154e3618a32b8b7eaf68d188d"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.142302 4697 scope.go:117] "RemoveContainer" containerID="a8b038b9ead0bc9a97b50c6f4c8bc6e710b43746fc631bec4a60f4514fc68175" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.142810 4697 scope.go:117] "RemoveContainer" containerID="3c189a1fdd8a35950990c7aaff7044115c85864154e3618a32b8b7eaf68d188d" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.143122 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lrpxf_openshift-multus(1de5dc4e-ef42-48fc-be23-eaec2039c031)\"" pod="openshift-multus/multus-lrpxf" podUID="1de5dc4e-ef42-48fc-be23-eaec2039c031" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.145567 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovnkube-controller/3.log" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.150643 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovn-acl-logging/0.log" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.151252 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zpdc_99eb233c-7094-4a86-ab37-0b160001bbef/ovn-controller/0.log" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.151934 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.151880 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f" exitCode=0 Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152000 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152014 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff" exitCode=0 Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152025 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0" exitCode=0 Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152036 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63" exitCode=0 Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152047 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a" exitCode=0 Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152077 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152092 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152102 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" 
event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152114 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152132 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152147 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152160 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152165 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152171 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152176 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152181 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152186 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152192 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152197 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152202 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152222 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b" exitCode=0 Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152229 4697 generic.go:334] "Generic (PLEG): container finished" podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725" exitCode=143 Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152237 4697 generic.go:334] "Generic (PLEG): container finished" 
podID="99eb233c-7094-4a86-ab37-0b160001bbef" containerID="38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00" exitCode=143 Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152251 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152260 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152266 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152272 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152279 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152301 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152306 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 
16:43:27.152312 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152317 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152323 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152328 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152335 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152343 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152349 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152354 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152359 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152377 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152382 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152387 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152392 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152397 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152402 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152412 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-9zpdc" event={"ID":"99eb233c-7094-4a86-ab37-0b160001bbef","Type":"ContainerDied","Data":"e089f465ff1499c7b6927f2a1d6395273892a6e3ad36ce8abdb8d491f59a5da1"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152421 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152427 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152451 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152458 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152463 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152468 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152474 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 
16:43:27.152479 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152485 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.152490 4697 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c"} Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.167014 4697 scope.go:117] "RemoveContainer" containerID="2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.178158 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-node-log\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.178216 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-slash\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.178271 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-slash" (OuterVolumeSpecName: "host-slash") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.178303 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-bin\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.178327 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-netns\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.178383 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99eb233c-7094-4a86-ab37-0b160001bbef-ovn-node-metrics-cert\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.178308 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-node-log" (OuterVolumeSpecName: "node-log") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.178326 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.178346 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.178868 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.178913 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.179243 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-env-overrides\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.179276 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-ovn-kubernetes\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.179296 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-systemd\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.179327 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-config\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.179383 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.179786 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.179972 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-systemd-units\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.179998 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-openvswitch\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.180106 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.180111 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). 
InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.180148 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-etc-openvswitch\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.180137 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.180176 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-var-lib-openvswitch\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.180961 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-script-lib\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.180196 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). 
InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.180220 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.181036 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-ovn\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.181216 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.181502 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.181538 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-kubelet\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.181597 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5wwd\" (UniqueName: \"kubernetes.io/projected/99eb233c-7094-4a86-ab37-0b160001bbef-kube-api-access-z5wwd\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.181622 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-netd\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.181668 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.181871 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.181934 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-log-socket\") pod \"99eb233c-7094-4a86-ab37-0b160001bbef\" (UID: \"99eb233c-7094-4a86-ab37-0b160001bbef\") " Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182033 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-log-socket" (OuterVolumeSpecName: "log-socket") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182084 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-ovnkube-config\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182117 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-ovn-node-metrics-cert\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182151 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-run-ovn-kubernetes\") pod \"ovnkube-node-26bt5\" (UID: 
\"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182184 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-var-lib-openvswitch\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182297 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-cni-netd\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182360 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-run-systemd\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182396 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-env-overrides\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182618 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182672 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-slash\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182723 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-run-ovn\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182766 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-ovnkube-script-lib\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.182982 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-log-socket\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183066 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-run-openvswitch\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183089 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9xl2\" (UniqueName: \"kubernetes.io/projected/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-kube-api-access-x9xl2\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183121 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-kubelet\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183138 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-systemd-units\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183163 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-etc-openvswitch\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183207 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-run-netns\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183261 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-node-log\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183662 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-cni-bin\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183753 4697 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-log-socket\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183768 4697 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-node-log\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183778 4697 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-slash\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183787 4697 reconciler_common.go:293] "Volume detached for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183795 4697 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183806 4697 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183823 4697 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183832 4697 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183832 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99eb233c-7094-4a86-ab37-0b160001bbef-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183842 4697 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183872 4697 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183881 4697 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183890 4697 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183893 4697 scope.go:117] "RemoveContainer" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183891 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99eb233c-7094-4a86-ab37-0b160001bbef-kube-api-access-z5wwd" (OuterVolumeSpecName: "kube-api-access-z5wwd") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "kube-api-access-z5wwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183898 4697 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183923 4697 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99eb233c-7094-4a86-ab37-0b160001bbef-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183934 4697 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183945 4697 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.183956 4697 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.194861 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "99eb233c-7094-4a86-ab37-0b160001bbef" (UID: "99eb233c-7094-4a86-ab37-0b160001bbef"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.198572 4697 scope.go:117] "RemoveContainer" containerID="e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.215054 4697 scope.go:117] "RemoveContainer" containerID="12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.232761 4697 scope.go:117] "RemoveContainer" containerID="c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.246775 4697 scope.go:117] "RemoveContainer" containerID="76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.258965 4697 scope.go:117] "RemoveContainer" containerID="ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.270930 4697 scope.go:117] "RemoveContainer" containerID="0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.283146 4697 scope.go:117] "RemoveContainer" containerID="38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284222 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-run-systemd\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284243 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-env-overrides\") pod \"ovnkube-node-26bt5\" (UID: 
\"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284268 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284291 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-slash\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284307 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-run-ovn\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284328 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-ovnkube-script-lib\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284330 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-run-systemd\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284342 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-log-socket\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284367 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-log-socket\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284378 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-run-openvswitch\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284404 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9xl2\" (UniqueName: \"kubernetes.io/projected/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-kube-api-access-x9xl2\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284426 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-kubelet\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: 
I0220 16:43:27.284522 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-systemd-units\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284546 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-etc-openvswitch\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284578 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-run-netns\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284651 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-node-log\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284684 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-cni-bin\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284708 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-ovnkube-config\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284730 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-ovn-node-metrics-cert\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284754 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-run-ovn-kubernetes\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284780 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-var-lib-openvswitch\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284802 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-cni-netd\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284829 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284843 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5wwd\" (UniqueName: \"kubernetes.io/projected/99eb233c-7094-4a86-ab37-0b160001bbef-kube-api-access-z5wwd\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284875 4697 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99eb233c-7094-4a86-ab37-0b160001bbef-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284896 4697 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99eb233c-7094-4a86-ab37-0b160001bbef-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284897 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-run-openvswitch\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284877 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-cni-netd\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284915 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-env-overrides\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284942 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-slash\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284956 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-run-ovn\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.284982 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-node-log\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.285002 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-kubelet\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.285022 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-systemd-units\") pod \"ovnkube-node-26bt5\" (UID: 
\"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.285043 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-etc-openvswitch\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.285062 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-run-netns\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.285218 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-run-ovn-kubernetes\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.285259 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-var-lib-openvswitch\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.285388 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-host-cni-bin\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" 
Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.285659 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-ovnkube-script-lib\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.285834 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-ovnkube-config\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.287654 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-ovn-node-metrics-cert\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.297405 4697 scope.go:117] "RemoveContainer" containerID="c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.303687 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9xl2\" (UniqueName: \"kubernetes.io/projected/6d2edaa2-a962-40a6-8cfc-66236ebf7a69-kube-api-access-x9xl2\") pod \"ovnkube-node-26bt5\" (UID: \"6d2edaa2-a962-40a6-8cfc-66236ebf7a69\") " pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.308240 4697 scope.go:117] "RemoveContainer" containerID="2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.308587 4697 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f\": container with ID starting with 2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f not found: ID does not exist" containerID="2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.308623 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f"} err="failed to get container status \"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f\": rpc error: code = NotFound desc = could not find container \"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f\": container with ID starting with 2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.308647 4697 scope.go:117] "RemoveContainer" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.308941 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\": container with ID starting with 1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41 not found: ID does not exist" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.308964 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41"} err="failed to get container status \"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\": rpc error: code = NotFound desc = could not find container 
\"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\": container with ID starting with 1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.308981 4697 scope.go:117] "RemoveContainer" containerID="e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.309376 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\": container with ID starting with e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff not found: ID does not exist" containerID="e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.309516 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff"} err="failed to get container status \"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\": rpc error: code = NotFound desc = could not find container \"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\": container with ID starting with e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.309604 4697 scope.go:117] "RemoveContainer" containerID="12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.310091 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\": container with ID starting with 12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0 not found: ID does not exist" 
containerID="12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.310116 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0"} err="failed to get container status \"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\": rpc error: code = NotFound desc = could not find container \"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\": container with ID starting with 12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.310133 4697 scope.go:117] "RemoveContainer" containerID="c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.310357 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\": container with ID starting with c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63 not found: ID does not exist" containerID="c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.310444 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63"} err="failed to get container status \"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\": rpc error: code = NotFound desc = could not find container \"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\": container with ID starting with c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.310519 4697 scope.go:117] 
"RemoveContainer" containerID="76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.310987 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\": container with ID starting with 76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a not found: ID does not exist" containerID="76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.311012 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a"} err="failed to get container status \"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\": rpc error: code = NotFound desc = could not find container \"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\": container with ID starting with 76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.311027 4697 scope.go:117] "RemoveContainer" containerID="ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.311292 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\": container with ID starting with ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b not found: ID does not exist" containerID="ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.311320 4697 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b"} err="failed to get container status \"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\": rpc error: code = NotFound desc = could not find container \"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\": container with ID starting with ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.311337 4697 scope.go:117] "RemoveContainer" containerID="0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.311622 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\": container with ID starting with 0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725 not found: ID does not exist" containerID="0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.311699 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725"} err="failed to get container status \"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\": rpc error: code = NotFound desc = could not find container \"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\": container with ID starting with 0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.311759 4697 scope.go:117] "RemoveContainer" containerID="38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.312060 4697 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\": container with ID starting with 38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00 not found: ID does not exist" containerID="38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.312090 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00"} err="failed to get container status \"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\": rpc error: code = NotFound desc = could not find container \"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\": container with ID starting with 38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.312108 4697 scope.go:117] "RemoveContainer" containerID="c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c" Feb 20 16:43:27 crc kubenswrapper[4697]: E0220 16:43:27.312349 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\": container with ID starting with c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c not found: ID does not exist" containerID="c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.312423 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c"} err="failed to get container status \"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\": rpc error: code = NotFound desc = could not find container 
\"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\": container with ID starting with c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.312530 4697 scope.go:117] "RemoveContainer" containerID="2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.312786 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f"} err="failed to get container status \"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f\": rpc error: code = NotFound desc = could not find container \"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f\": container with ID starting with 2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.312858 4697 scope.go:117] "RemoveContainer" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.313245 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41"} err="failed to get container status \"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\": rpc error: code = NotFound desc = could not find container \"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\": container with ID starting with 1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.313321 4697 scope.go:117] "RemoveContainer" containerID="e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.313676 4697 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff"} err="failed to get container status \"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\": rpc error: code = NotFound desc = could not find container \"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\": container with ID starting with e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.313700 4697 scope.go:117] "RemoveContainer" containerID="12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.314008 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0"} err="failed to get container status \"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\": rpc error: code = NotFound desc = could not find container \"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\": container with ID starting with 12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.314035 4697 scope.go:117] "RemoveContainer" containerID="c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.314242 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63"} err="failed to get container status \"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\": rpc error: code = NotFound desc = could not find container \"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\": container with ID starting with 
c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.314266 4697 scope.go:117] "RemoveContainer" containerID="76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.314554 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a"} err="failed to get container status \"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\": rpc error: code = NotFound desc = could not find container \"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\": container with ID starting with 76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.314577 4697 scope.go:117] "RemoveContainer" containerID="ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.314843 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b"} err="failed to get container status \"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\": rpc error: code = NotFound desc = could not find container \"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\": container with ID starting with ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.314863 4697 scope.go:117] "RemoveContainer" containerID="0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.315113 4697 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725"} err="failed to get container status \"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\": rpc error: code = NotFound desc = could not find container \"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\": container with ID starting with 0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.315141 4697 scope.go:117] "RemoveContainer" containerID="38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.315381 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00"} err="failed to get container status \"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\": rpc error: code = NotFound desc = could not find container \"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\": container with ID starting with 38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.315400 4697 scope.go:117] "RemoveContainer" containerID="c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.315623 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c"} err="failed to get container status \"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\": rpc error: code = NotFound desc = could not find container \"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\": container with ID starting with c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c not found: ID does not 
exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.315643 4697 scope.go:117] "RemoveContainer" containerID="2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.315849 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f"} err="failed to get container status \"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f\": rpc error: code = NotFound desc = could not find container \"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f\": container with ID starting with 2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.315867 4697 scope.go:117] "RemoveContainer" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.316132 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41"} err="failed to get container status \"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\": rpc error: code = NotFound desc = could not find container \"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\": container with ID starting with 1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.316206 4697 scope.go:117] "RemoveContainer" containerID="e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.316526 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff"} err="failed to get container status 
\"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\": rpc error: code = NotFound desc = could not find container \"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\": container with ID starting with e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.316547 4697 scope.go:117] "RemoveContainer" containerID="12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.316823 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0"} err="failed to get container status \"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\": rpc error: code = NotFound desc = could not find container \"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\": container with ID starting with 12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.316841 4697 scope.go:117] "RemoveContainer" containerID="c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.317081 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63"} err="failed to get container status \"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\": rpc error: code = NotFound desc = could not find container \"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\": container with ID starting with c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.317105 4697 scope.go:117] "RemoveContainer" 
containerID="76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.317320 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a"} err="failed to get container status \"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\": rpc error: code = NotFound desc = could not find container \"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\": container with ID starting with 76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.317345 4697 scope.go:117] "RemoveContainer" containerID="ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.317595 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b"} err="failed to get container status \"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\": rpc error: code = NotFound desc = could not find container \"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\": container with ID starting with ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.317615 4697 scope.go:117] "RemoveContainer" containerID="0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.317805 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725"} err="failed to get container status \"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\": rpc error: code = NotFound desc = could 
not find container \"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\": container with ID starting with 0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.317826 4697 scope.go:117] "RemoveContainer" containerID="38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.318057 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00"} err="failed to get container status \"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\": rpc error: code = NotFound desc = could not find container \"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\": container with ID starting with 38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.318081 4697 scope.go:117] "RemoveContainer" containerID="c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.318295 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c"} err="failed to get container status \"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\": rpc error: code = NotFound desc = could not find container \"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\": container with ID starting with c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.318311 4697 scope.go:117] "RemoveContainer" containerID="2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 
16:43:27.318520 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f"} err="failed to get container status \"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f\": rpc error: code = NotFound desc = could not find container \"2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f\": container with ID starting with 2e56aaa2a8f227e56494a32fc28d6d09f534d4c97a29125851370ce08928224f not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.318541 4697 scope.go:117] "RemoveContainer" containerID="1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.318774 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41"} err="failed to get container status \"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\": rpc error: code = NotFound desc = could not find container \"1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41\": container with ID starting with 1e8ba7d9bc394dbb53ef47e42b33831dae727dfcc2f68caca4f7ad8526799e41 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.318795 4697 scope.go:117] "RemoveContainer" containerID="e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.319039 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff"} err="failed to get container status \"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\": rpc error: code = NotFound desc = could not find container \"e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff\": container with ID starting with 
e0bac2eca4184d20759dbe6de84b33d96eaf0fa532b8736a431cb4f97b177bff not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.319058 4697 scope.go:117] "RemoveContainer" containerID="12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.319279 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0"} err="failed to get container status \"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\": rpc error: code = NotFound desc = could not find container \"12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0\": container with ID starting with 12136d4d9e6796966fb4b4e5a960bc6c052ca771b2b6c145bd495e6ffa3982d0 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.319301 4697 scope.go:117] "RemoveContainer" containerID="c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.319569 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63"} err="failed to get container status \"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\": rpc error: code = NotFound desc = could not find container \"c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63\": container with ID starting with c139f8110b209afb5e3be5eef47b8dd6c9962a82c296e26febf97e671c01ab63 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.319591 4697 scope.go:117] "RemoveContainer" containerID="76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.319795 4697 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a"} err="failed to get container status \"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\": rpc error: code = NotFound desc = could not find container \"76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a\": container with ID starting with 76f8a689b2faf77107a347779397fa6ae14309714be181895a30f675565ec91a not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.319813 4697 scope.go:117] "RemoveContainer" containerID="ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.320064 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b"} err="failed to get container status \"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\": rpc error: code = NotFound desc = could not find container \"ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b\": container with ID starting with ed29b0cbb0584ad127cdfc7b20fe43d7ee1735f8b312d931dfffd43e584f9b5b not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.320084 4697 scope.go:117] "RemoveContainer" containerID="0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.320288 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725"} err="failed to get container status \"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\": rpc error: code = NotFound desc = could not find container \"0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725\": container with ID starting with 0716767b98948f5cce3879654b7a6dc6da92d339dabd46a753eb61e8d91e9725 not found: ID does not 
exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.320309 4697 scope.go:117] "RemoveContainer" containerID="38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.320635 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00"} err="failed to get container status \"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\": rpc error: code = NotFound desc = could not find container \"38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00\": container with ID starting with 38c6aa47c83be3f69d43aeaa83f8b85653fa4ae1efd583acdf9ee0c00b176a00 not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.320658 4697 scope.go:117] "RemoveContainer" containerID="c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.320902 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c"} err="failed to get container status \"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\": rpc error: code = NotFound desc = could not find container \"c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c\": container with ID starting with c7e0903d1380aebd693093db8b5ace585390b279b1a8d35b0cd86ab8d91e5a6c not found: ID does not exist" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.411164 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.518850 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9zpdc"] Feb 20 16:43:27 crc kubenswrapper[4697]: I0220 16:43:27.523809 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9zpdc"] Feb 20 16:43:28 crc kubenswrapper[4697]: I0220 16:43:28.160834 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lrpxf_1de5dc4e-ef42-48fc-be23-eaec2039c031/kube-multus/2.log" Feb 20 16:43:28 crc kubenswrapper[4697]: I0220 16:43:28.162844 4697 generic.go:334] "Generic (PLEG): container finished" podID="6d2edaa2-a962-40a6-8cfc-66236ebf7a69" containerID="4db1aeef90692b2c5537258ff1ffb0de48d50b2c4c6ede8a05bb6ac0b714ed07" exitCode=0 Feb 20 16:43:28 crc kubenswrapper[4697]: I0220 16:43:28.162880 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" event={"ID":"6d2edaa2-a962-40a6-8cfc-66236ebf7a69","Type":"ContainerDied","Data":"4db1aeef90692b2c5537258ff1ffb0de48d50b2c4c6ede8a05bb6ac0b714ed07"} Feb 20 16:43:28 crc kubenswrapper[4697]: I0220 16:43:28.162904 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" event={"ID":"6d2edaa2-a962-40a6-8cfc-66236ebf7a69","Type":"ContainerStarted","Data":"731d8ad07ac6d02fbd508178dc8c5a72cddcd52c089457586f495977e6969209"} Feb 20 16:43:28 crc kubenswrapper[4697]: I0220 16:43:28.882431 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99eb233c-7094-4a86-ab37-0b160001bbef" path="/var/lib/kubelet/pods/99eb233c-7094-4a86-ab37-0b160001bbef/volumes" Feb 20 16:43:29 crc kubenswrapper[4697]: I0220 16:43:29.172706 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" 
event={"ID":"6d2edaa2-a962-40a6-8cfc-66236ebf7a69","Type":"ContainerStarted","Data":"090781307d34742f46d472d7fe3bdc2d31cd03a99c41d7b8482a2ed2c956b8f4"} Feb 20 16:43:29 crc kubenswrapper[4697]: I0220 16:43:29.172750 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" event={"ID":"6d2edaa2-a962-40a6-8cfc-66236ebf7a69","Type":"ContainerStarted","Data":"dc0145a8772a637151b03fb3e08ef8027d379a3147989dd499ec4d41447d714e"} Feb 20 16:43:29 crc kubenswrapper[4697]: I0220 16:43:29.172762 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" event={"ID":"6d2edaa2-a962-40a6-8cfc-66236ebf7a69","Type":"ContainerStarted","Data":"a47ca5ccb0539795480ff44e47d3237b1489d9aa533160245940e9f957edae04"} Feb 20 16:43:29 crc kubenswrapper[4697]: I0220 16:43:29.172772 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" event={"ID":"6d2edaa2-a962-40a6-8cfc-66236ebf7a69","Type":"ContainerStarted","Data":"ff968d689b1bdef76f4ece4f8728c94b7a57a330cabe819cc9b909b461b0e9fd"} Feb 20 16:43:29 crc kubenswrapper[4697]: I0220 16:43:29.172779 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" event={"ID":"6d2edaa2-a962-40a6-8cfc-66236ebf7a69","Type":"ContainerStarted","Data":"d8ed09976ffb1bfc356ad5e2fc0277bbde00944b215ee3fb0961dd4a44382521"} Feb 20 16:43:29 crc kubenswrapper[4697]: I0220 16:43:29.172786 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" event={"ID":"6d2edaa2-a962-40a6-8cfc-66236ebf7a69","Type":"ContainerStarted","Data":"e25b6350d633b15eec45e9dc46f0f16a1bffc6b35d1035a4a44923759830f720"} Feb 20 16:43:31 crc kubenswrapper[4697]: I0220 16:43:31.184962 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:43:31 crc kubenswrapper[4697]: I0220 16:43:31.186248 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:43:31 crc kubenswrapper[4697]: I0220 16:43:31.192629 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" event={"ID":"6d2edaa2-a962-40a6-8cfc-66236ebf7a69","Type":"ContainerStarted","Data":"cdc29be4fdf0254db707ac8487a61a5ab975cb6bea01192134889abbfcccdb82"} Feb 20 16:43:33 crc kubenswrapper[4697]: I0220 16:43:33.142116 4697 scope.go:117] "RemoveContainer" containerID="8ad2027f9675a760cb53da06f4a0fae8cc318213f9c6cdfad85ca9da18b4ad30" Feb 20 16:43:34 crc kubenswrapper[4697]: I0220 16:43:34.215422 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" event={"ID":"6d2edaa2-a962-40a6-8cfc-66236ebf7a69","Type":"ContainerStarted","Data":"44681aa49735e16ae8ce8191e1a97b377ba2e422f2d7c6166a8d97bf6f3aa574"} Feb 20 16:43:34 crc kubenswrapper[4697]: I0220 16:43:34.215820 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:34 crc kubenswrapper[4697]: I0220 16:43:34.244368 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" podStartSLOduration=7.24434964 podStartE2EDuration="7.24434964s" podCreationTimestamp="2026-02-20 16:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:43:34.241572131 +0000 UTC m=+722.021617539" 
watchObservedRunningTime="2026-02-20 16:43:34.24434964 +0000 UTC m=+722.024395048" Feb 20 16:43:34 crc kubenswrapper[4697]: I0220 16:43:34.254471 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:35 crc kubenswrapper[4697]: I0220 16:43:35.231302 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:35 crc kubenswrapper[4697]: I0220 16:43:35.231355 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:35 crc kubenswrapper[4697]: I0220 16:43:35.278846 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:43:36 crc kubenswrapper[4697]: I0220 16:43:36.933190 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh"] Feb 20 16:43:36 crc kubenswrapper[4697]: I0220 16:43:36.937266 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:36 crc kubenswrapper[4697]: I0220 16:43:36.940504 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 16:43:36 crc kubenswrapper[4697]: I0220 16:43:36.941000 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh"] Feb 20 16:43:37 crc kubenswrapper[4697]: I0220 16:43:37.012793 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: I0220 16:43:37.012838 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: I0220 16:43:37.012931 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk8sn\" (UniqueName: \"kubernetes.io/projected/c9412f48-6077-4f90-8d2d-869512ab617d-kube-api-access-pk8sn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: 
I0220 16:43:37.114255 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk8sn\" (UniqueName: \"kubernetes.io/projected/c9412f48-6077-4f90-8d2d-869512ab617d-kube-api-access-pk8sn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: I0220 16:43:37.114329 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: I0220 16:43:37.114356 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: I0220 16:43:37.114922 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: I0220 16:43:37.114961 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: I0220 16:43:37.139883 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk8sn\" (UniqueName: \"kubernetes.io/projected/c9412f48-6077-4f90-8d2d-869512ab617d-kube-api-access-pk8sn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: I0220 16:43:37.254752 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: E0220 16:43:37.295573 4697 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(cfb09ff15b5a893c0b7138eeddff543c5af37351808c8cbf194b58a43cbfb947): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 16:43:37 crc kubenswrapper[4697]: E0220 16:43:37.295677 4697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(cfb09ff15b5a893c0b7138eeddff543c5af37351808c8cbf194b58a43cbfb947): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: E0220 16:43:37.295727 4697 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(cfb09ff15b5a893c0b7138eeddff543c5af37351808c8cbf194b58a43cbfb947): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:37 crc kubenswrapper[4697]: E0220 16:43:37.295821 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace(c9412f48-6077-4f90-8d2d-869512ab617d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace(c9412f48-6077-4f90-8d2d-869512ab617d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(cfb09ff15b5a893c0b7138eeddff543c5af37351808c8cbf194b58a43cbfb947): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" podUID="c9412f48-6077-4f90-8d2d-869512ab617d" Feb 20 16:43:38 crc kubenswrapper[4697]: I0220 16:43:38.244629 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:38 crc kubenswrapper[4697]: I0220 16:43:38.245335 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:38 crc kubenswrapper[4697]: E0220 16:43:38.265186 4697 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(1fd80faae8559970d66de8f04b6d16a03d6f73d0134552e677dde1f489a390bc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 16:43:38 crc kubenswrapper[4697]: E0220 16:43:38.265274 4697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(1fd80faae8559970d66de8f04b6d16a03d6f73d0134552e677dde1f489a390bc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:38 crc kubenswrapper[4697]: E0220 16:43:38.265309 4697 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(1fd80faae8559970d66de8f04b6d16a03d6f73d0134552e677dde1f489a390bc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:38 crc kubenswrapper[4697]: E0220 16:43:38.265389 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace(c9412f48-6077-4f90-8d2d-869512ab617d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace(c9412f48-6077-4f90-8d2d-869512ab617d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(1fd80faae8559970d66de8f04b6d16a03d6f73d0134552e677dde1f489a390bc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" podUID="c9412f48-6077-4f90-8d2d-869512ab617d" Feb 20 16:43:40 crc kubenswrapper[4697]: I0220 16:43:40.877474 4697 scope.go:117] "RemoveContainer" containerID="3c189a1fdd8a35950990c7aaff7044115c85864154e3618a32b8b7eaf68d188d" Feb 20 16:43:40 crc kubenswrapper[4697]: E0220 16:43:40.877868 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lrpxf_openshift-multus(1de5dc4e-ef42-48fc-be23-eaec2039c031)\"" pod="openshift-multus/multus-lrpxf" podUID="1de5dc4e-ef42-48fc-be23-eaec2039c031" Feb 20 16:43:49 crc kubenswrapper[4697]: I0220 16:43:49.876539 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:49 crc kubenswrapper[4697]: I0220 16:43:49.877380 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:49 crc kubenswrapper[4697]: E0220 16:43:49.908054 4697 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(289063f38cdf58f6f4efea22d77cbb51981257f02f00ef3a697e71fb807e6d46): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 16:43:49 crc kubenswrapper[4697]: E0220 16:43:49.908122 4697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(289063f38cdf58f6f4efea22d77cbb51981257f02f00ef3a697e71fb807e6d46): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:49 crc kubenswrapper[4697]: E0220 16:43:49.908152 4697 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(289063f38cdf58f6f4efea22d77cbb51981257f02f00ef3a697e71fb807e6d46): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:43:49 crc kubenswrapper[4697]: E0220 16:43:49.908213 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace(c9412f48-6077-4f90-8d2d-869512ab617d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace(c9412f48-6077-4f90-8d2d-869512ab617d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_openshift-marketplace_c9412f48-6077-4f90-8d2d-869512ab617d_0(289063f38cdf58f6f4efea22d77cbb51981257f02f00ef3a697e71fb807e6d46): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" podUID="c9412f48-6077-4f90-8d2d-869512ab617d" Feb 20 16:43:51 crc kubenswrapper[4697]: I0220 16:43:51.877413 4697 scope.go:117] "RemoveContainer" containerID="3c189a1fdd8a35950990c7aaff7044115c85864154e3618a32b8b7eaf68d188d" Feb 20 16:43:52 crc kubenswrapper[4697]: I0220 16:43:52.332220 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lrpxf_1de5dc4e-ef42-48fc-be23-eaec2039c031/kube-multus/2.log" Feb 20 16:43:52 crc kubenswrapper[4697]: I0220 16:43:52.332567 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lrpxf" event={"ID":"1de5dc4e-ef42-48fc-be23-eaec2039c031","Type":"ContainerStarted","Data":"987fccba8a36183748157175b24fc05455d06094f4f8c33db2a58db6f7285caf"} Feb 20 16:43:57 crc kubenswrapper[4697]: I0220 16:43:57.493618 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-26bt5" Feb 20 16:44:01 crc kubenswrapper[4697]: 
I0220 16:44:01.184971 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:44:01 crc kubenswrapper[4697]: I0220 16:44:01.185613 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:44:01 crc kubenswrapper[4697]: I0220 16:44:01.185697 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:44:01 crc kubenswrapper[4697]: I0220 16:44:01.186793 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a39e2b324782ff79e96c097bdf5a12c5992709cf28743465f5ab68009a413113"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 16:44:01 crc kubenswrapper[4697]: I0220 16:44:01.186921 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://a39e2b324782ff79e96c097bdf5a12c5992709cf28743465f5ab68009a413113" gracePeriod=600 Feb 20 16:44:01 crc kubenswrapper[4697]: I0220 16:44:01.388915 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="a39e2b324782ff79e96c097bdf5a12c5992709cf28743465f5ab68009a413113" exitCode=0 Feb 
20 16:44:01 crc kubenswrapper[4697]: I0220 16:44:01.389012 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"a39e2b324782ff79e96c097bdf5a12c5992709cf28743465f5ab68009a413113"} Feb 20 16:44:01 crc kubenswrapper[4697]: I0220 16:44:01.389307 4697 scope.go:117] "RemoveContainer" containerID="37b12a08456eb67b5389b54c7c7590b240888403dd45128a2dd330327f7ef7cb" Feb 20 16:44:02 crc kubenswrapper[4697]: I0220 16:44:02.397795 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"ac1ff61636e81a13d334b99986c31ee9bcf221f2d7263a9112ad988ea78c70f4"} Feb 20 16:44:04 crc kubenswrapper[4697]: I0220 16:44:04.876829 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:44:04 crc kubenswrapper[4697]: I0220 16:44:04.877949 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:44:05 crc kubenswrapper[4697]: I0220 16:44:05.304566 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh"] Feb 20 16:44:05 crc kubenswrapper[4697]: I0220 16:44:05.417482 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" event={"ID":"c9412f48-6077-4f90-8d2d-869512ab617d","Type":"ContainerStarted","Data":"90f56c4ec9e594394c2664a249c1451f12fa1e1a1c601d26cab6e083ff952b25"} Feb 20 16:44:06 crc kubenswrapper[4697]: I0220 16:44:06.424193 4697 generic.go:334] "Generic (PLEG): container finished" podID="c9412f48-6077-4f90-8d2d-869512ab617d" containerID="d86f4552e03b1115bfdd546087744a9c48d7ed316018fb61522d67a6b2a829d3" exitCode=0 Feb 20 16:44:06 crc kubenswrapper[4697]: I0220 16:44:06.424232 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" event={"ID":"c9412f48-6077-4f90-8d2d-869512ab617d","Type":"ContainerDied","Data":"d86f4552e03b1115bfdd546087744a9c48d7ed316018fb61522d67a6b2a829d3"} Feb 20 16:44:08 crc kubenswrapper[4697]: I0220 16:44:08.436781 4697 generic.go:334] "Generic (PLEG): container finished" podID="c9412f48-6077-4f90-8d2d-869512ab617d" containerID="869f7467f4f30f527333c23b62ab74de22c556f4c6e2ea19f0dcc4223197a8aa" exitCode=0 Feb 20 16:44:08 crc kubenswrapper[4697]: I0220 16:44:08.436885 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" event={"ID":"c9412f48-6077-4f90-8d2d-869512ab617d","Type":"ContainerDied","Data":"869f7467f4f30f527333c23b62ab74de22c556f4c6e2ea19f0dcc4223197a8aa"} Feb 20 16:44:09 crc kubenswrapper[4697]: I0220 16:44:09.445463 4697 
generic.go:334] "Generic (PLEG): container finished" podID="c9412f48-6077-4f90-8d2d-869512ab617d" containerID="68e11cc98a3b6661234c566b2f33db7264f20153cb885a23a5b462c00fe8d559" exitCode=0 Feb 20 16:44:09 crc kubenswrapper[4697]: I0220 16:44:09.445522 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" event={"ID":"c9412f48-6077-4f90-8d2d-869512ab617d","Type":"ContainerDied","Data":"68e11cc98a3b6661234c566b2f33db7264f20153cb885a23a5b462c00fe8d559"} Feb 20 16:44:10 crc kubenswrapper[4697]: I0220 16:44:10.661313 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:44:10 crc kubenswrapper[4697]: I0220 16:44:10.844224 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-util\") pod \"c9412f48-6077-4f90-8d2d-869512ab617d\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " Feb 20 16:44:10 crc kubenswrapper[4697]: I0220 16:44:10.844270 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-bundle\") pod \"c9412f48-6077-4f90-8d2d-869512ab617d\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " Feb 20 16:44:10 crc kubenswrapper[4697]: I0220 16:44:10.844452 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk8sn\" (UniqueName: \"kubernetes.io/projected/c9412f48-6077-4f90-8d2d-869512ab617d-kube-api-access-pk8sn\") pod \"c9412f48-6077-4f90-8d2d-869512ab617d\" (UID: \"c9412f48-6077-4f90-8d2d-869512ab617d\") " Feb 20 16:44:10 crc kubenswrapper[4697]: I0220 16:44:10.847866 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-bundle" (OuterVolumeSpecName: "bundle") pod "c9412f48-6077-4f90-8d2d-869512ab617d" (UID: "c9412f48-6077-4f90-8d2d-869512ab617d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:44:10 crc kubenswrapper[4697]: I0220 16:44:10.852566 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9412f48-6077-4f90-8d2d-869512ab617d-kube-api-access-pk8sn" (OuterVolumeSpecName: "kube-api-access-pk8sn") pod "c9412f48-6077-4f90-8d2d-869512ab617d" (UID: "c9412f48-6077-4f90-8d2d-869512ab617d"). InnerVolumeSpecName "kube-api-access-pk8sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:44:10 crc kubenswrapper[4697]: I0220 16:44:10.949014 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk8sn\" (UniqueName: \"kubernetes.io/projected/c9412f48-6077-4f90-8d2d-869512ab617d-kube-api-access-pk8sn\") on node \"crc\" DevicePath \"\"" Feb 20 16:44:10 crc kubenswrapper[4697]: I0220 16:44:10.949087 4697 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:44:10 crc kubenswrapper[4697]: I0220 16:44:10.985722 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-util" (OuterVolumeSpecName: "util") pod "c9412f48-6077-4f90-8d2d-869512ab617d" (UID: "c9412f48-6077-4f90-8d2d-869512ab617d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:44:11 crc kubenswrapper[4697]: I0220 16:44:11.050352 4697 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9412f48-6077-4f90-8d2d-869512ab617d-util\") on node \"crc\" DevicePath \"\"" Feb 20 16:44:11 crc kubenswrapper[4697]: I0220 16:44:11.462880 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" Feb 20 16:44:11 crc kubenswrapper[4697]: I0220 16:44:11.463568 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh" event={"ID":"c9412f48-6077-4f90-8d2d-869512ab617d","Type":"ContainerDied","Data":"90f56c4ec9e594394c2664a249c1451f12fa1e1a1c601d26cab6e083ff952b25"} Feb 20 16:44:11 crc kubenswrapper[4697]: I0220 16:44:11.463649 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90f56c4ec9e594394c2664a249c1451f12fa1e1a1c601d26cab6e083ff952b25" Feb 20 16:44:17 crc kubenswrapper[4697]: I0220 16:44:17.427552 4697 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.172981 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-72v4q"] Feb 20 16:44:19 crc kubenswrapper[4697]: E0220 16:44:19.173468 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9412f48-6077-4f90-8d2d-869512ab617d" containerName="extract" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.173481 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9412f48-6077-4f90-8d2d-869512ab617d" containerName="extract" Feb 20 16:44:19 crc kubenswrapper[4697]: E0220 16:44:19.173500 4697 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c9412f48-6077-4f90-8d2d-869512ab617d" containerName="util" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.173508 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9412f48-6077-4f90-8d2d-869512ab617d" containerName="util" Feb 20 16:44:19 crc kubenswrapper[4697]: E0220 16:44:19.173523 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9412f48-6077-4f90-8d2d-869512ab617d" containerName="pull" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.173530 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9412f48-6077-4f90-8d2d-869512ab617d" containerName="pull" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.173634 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9412f48-6077-4f90-8d2d-869512ab617d" containerName="extract" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.174499 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.224790 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-72v4q"] Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.249320 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-utilities\") pod \"redhat-operators-72v4q\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.249357 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-catalog-content\") pod \"redhat-operators-72v4q\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 
16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.249494 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm2rf\" (UniqueName: \"kubernetes.io/projected/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-kube-api-access-cm2rf\") pod \"redhat-operators-72v4q\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.350877 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-utilities\") pod \"redhat-operators-72v4q\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.350919 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-catalog-content\") pod \"redhat-operators-72v4q\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.350943 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm2rf\" (UniqueName: \"kubernetes.io/projected/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-kube-api-access-cm2rf\") pod \"redhat-operators-72v4q\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.351496 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-utilities\") pod \"redhat-operators-72v4q\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:19 crc kubenswrapper[4697]: 
I0220 16:44:19.351507 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-catalog-content\") pod \"redhat-operators-72v4q\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.370358 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm2rf\" (UniqueName: \"kubernetes.io/projected/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-kube-api-access-cm2rf\") pod \"redhat-operators-72v4q\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.493524 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:19 crc kubenswrapper[4697]: I0220 16:44:19.728607 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-72v4q"] Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.109774 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-cqddv"] Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.110687 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cqddv"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.112532 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rhpff"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.112568 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.112755 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.122613 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-cqddv"]
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.160178 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgwlr\" (UniqueName: \"kubernetes.io/projected/214e6680-a3c8-4286-9c91-d68893ba73be-kube-api-access-vgwlr\") pod \"obo-prometheus-operator-68bc856cb9-cqddv\" (UID: \"214e6680-a3c8-4286-9c91-d68893ba73be\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cqddv"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.262014 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5"]
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.262017 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgwlr\" (UniqueName: \"kubernetes.io/projected/214e6680-a3c8-4286-9c91-d68893ba73be-kube-api-access-vgwlr\") pod \"obo-prometheus-operator-68bc856cb9-cqddv\" (UID: \"214e6680-a3c8-4286-9c91-d68893ba73be\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cqddv"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.262612 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5"
Feb 20 16:44:20 crc kubenswrapper[4697]: W0220 16:44:20.265068 4697 reflector.go:561] object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-t4v6c": failed to list *v1.Secret: secrets "obo-prometheus-operator-admission-webhook-dockercfg-t4v6c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object
Feb 20 16:44:20 crc kubenswrapper[4697]: E0220 16:44:20.265113 4697 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"obo-prometheus-operator-admission-webhook-dockercfg-t4v6c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"obo-prometheus-operator-admission-webhook-dockercfg-t4v6c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.266374 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.281930 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw"]
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.282742 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.290247 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5"]
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.304745 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgwlr\" (UniqueName: \"kubernetes.io/projected/214e6680-a3c8-4286-9c91-d68893ba73be-kube-api-access-vgwlr\") pod \"obo-prometheus-operator-68bc856cb9-cqddv\" (UID: \"214e6680-a3c8-4286-9c91-d68893ba73be\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cqddv"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.307860 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw"]
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.363186 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/68c668ba-cbcc-4330-95e2-012c78108925-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-phkqw\" (UID: \"68c668ba-cbcc-4330-95e2-012c78108925\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.363229 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/565bbe20-d8eb-4878-a048-4d78d8123f6d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5\" (UID: \"565bbe20-d8eb-4878-a048-4d78d8123f6d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.363262 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/68c668ba-cbcc-4330-95e2-012c78108925-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-phkqw\" (UID: \"68c668ba-cbcc-4330-95e2-012c78108925\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.363283 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/565bbe20-d8eb-4878-a048-4d78d8123f6d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5\" (UID: \"565bbe20-d8eb-4878-a048-4d78d8123f6d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.423750 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cqddv"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.437647 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8wmf5"]
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.438259 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8wmf5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.440517 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-2g25m"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.440550 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.449957 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8wmf5"]
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.464255 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/68c668ba-cbcc-4330-95e2-012c78108925-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-phkqw\" (UID: \"68c668ba-cbcc-4330-95e2-012c78108925\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.464955 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/565bbe20-d8eb-4878-a048-4d78d8123f6d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5\" (UID: \"565bbe20-d8eb-4878-a048-4d78d8123f6d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.464994 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxtwm\" (UniqueName: \"kubernetes.io/projected/12db7fd9-ce39-43cf-99b7-3a56791c0390-kube-api-access-rxtwm\") pod \"observability-operator-59bdc8b94-8wmf5\" (UID: \"12db7fd9-ce39-43cf-99b7-3a56791c0390\") " pod="openshift-operators/observability-operator-59bdc8b94-8wmf5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.465027 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/68c668ba-cbcc-4330-95e2-012c78108925-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-phkqw\" (UID: \"68c668ba-cbcc-4330-95e2-012c78108925\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.465044 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/12db7fd9-ce39-43cf-99b7-3a56791c0390-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8wmf5\" (UID: \"12db7fd9-ce39-43cf-99b7-3a56791c0390\") " pod="openshift-operators/observability-operator-59bdc8b94-8wmf5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.465065 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/565bbe20-d8eb-4878-a048-4d78d8123f6d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5\" (UID: \"565bbe20-d8eb-4878-a048-4d78d8123f6d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.467943 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/68c668ba-cbcc-4330-95e2-012c78108925-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-phkqw\" (UID: \"68c668ba-cbcc-4330-95e2-012c78108925\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.468047 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/565bbe20-d8eb-4878-a048-4d78d8123f6d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5\" (UID: \"565bbe20-d8eb-4878-a048-4d78d8123f6d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.468060 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/565bbe20-d8eb-4878-a048-4d78d8123f6d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5\" (UID: \"565bbe20-d8eb-4878-a048-4d78d8123f6d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.480942 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/68c668ba-cbcc-4330-95e2-012c78108925-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8679bd8497-phkqw\" (UID: \"68c668ba-cbcc-4330-95e2-012c78108925\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.506655 4697 generic.go:334] "Generic (PLEG): container finished" podID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerID="585ce33744decb9beda974b3f1b5b54e8b12b09782ca8f142b090a529db44959" exitCode=0
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.506880 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72v4q" event={"ID":"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f","Type":"ContainerDied","Data":"585ce33744decb9beda974b3f1b5b54e8b12b09782ca8f142b090a529db44959"}
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.506904 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72v4q" event={"ID":"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f","Type":"ContainerStarted","Data":"adc38b2830b3e96d049dec65fa9e99a6563ccb408ec69c613f3cc6a5ac82cb6b"}
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.571084 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxtwm\" (UniqueName: \"kubernetes.io/projected/12db7fd9-ce39-43cf-99b7-3a56791c0390-kube-api-access-rxtwm\") pod \"observability-operator-59bdc8b94-8wmf5\" (UID: \"12db7fd9-ce39-43cf-99b7-3a56791c0390\") " pod="openshift-operators/observability-operator-59bdc8b94-8wmf5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.571136 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/12db7fd9-ce39-43cf-99b7-3a56791c0390-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8wmf5\" (UID: \"12db7fd9-ce39-43cf-99b7-3a56791c0390\") " pod="openshift-operators/observability-operator-59bdc8b94-8wmf5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.574408 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/12db7fd9-ce39-43cf-99b7-3a56791c0390-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8wmf5\" (UID: \"12db7fd9-ce39-43cf-99b7-3a56791c0390\") " pod="openshift-operators/observability-operator-59bdc8b94-8wmf5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.595134 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxtwm\" (UniqueName: \"kubernetes.io/projected/12db7fd9-ce39-43cf-99b7-3a56791c0390-kube-api-access-rxtwm\") pod \"observability-operator-59bdc8b94-8wmf5\" (UID: \"12db7fd9-ce39-43cf-99b7-3a56791c0390\") " pod="openshift-operators/observability-operator-59bdc8b94-8wmf5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.664824 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-j5gtw"]
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.665492 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-j5gtw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.670227 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gtvp5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.686756 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-j5gtw"]
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.715369 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-cqddv"]
Feb 20 16:44:20 crc kubenswrapper[4697]: W0220 16:44:20.720011 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod214e6680_a3c8_4286_9c91_d68893ba73be.slice/crio-821383087c502cf5227c9b724bc0c0c578f7c1e826d28cf02a7e1724a39828d1 WatchSource:0}: Error finding container 821383087c502cf5227c9b724bc0c0c578f7c1e826d28cf02a7e1724a39828d1: Status 404 returned error can't find the container with id 821383087c502cf5227c9b724bc0c0c578f7c1e826d28cf02a7e1724a39828d1
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.773906 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qpvq\" (UniqueName: \"kubernetes.io/projected/94f7dd38-b255-46cd-8b05-4c720857dd86-kube-api-access-4qpvq\") pod \"perses-operator-5bf474d74f-j5gtw\" (UID: \"94f7dd38-b255-46cd-8b05-4c720857dd86\") " pod="openshift-operators/perses-operator-5bf474d74f-j5gtw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.773966 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/94f7dd38-b255-46cd-8b05-4c720857dd86-openshift-service-ca\") pod \"perses-operator-5bf474d74f-j5gtw\" (UID: \"94f7dd38-b255-46cd-8b05-4c720857dd86\") " pod="openshift-operators/perses-operator-5bf474d74f-j5gtw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.812901 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8wmf5"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.874478 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qpvq\" (UniqueName: \"kubernetes.io/projected/94f7dd38-b255-46cd-8b05-4c720857dd86-kube-api-access-4qpvq\") pod \"perses-operator-5bf474d74f-j5gtw\" (UID: \"94f7dd38-b255-46cd-8b05-4c720857dd86\") " pod="openshift-operators/perses-operator-5bf474d74f-j5gtw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.874523 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/94f7dd38-b255-46cd-8b05-4c720857dd86-openshift-service-ca\") pod \"perses-operator-5bf474d74f-j5gtw\" (UID: \"94f7dd38-b255-46cd-8b05-4c720857dd86\") " pod="openshift-operators/perses-operator-5bf474d74f-j5gtw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.875413 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/94f7dd38-b255-46cd-8b05-4c720857dd86-openshift-service-ca\") pod \"perses-operator-5bf474d74f-j5gtw\" (UID: \"94f7dd38-b255-46cd-8b05-4c720857dd86\") " pod="openshift-operators/perses-operator-5bf474d74f-j5gtw"
Feb 20 16:44:20 crc kubenswrapper[4697]: I0220 16:44:20.910448 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qpvq\" (UniqueName: \"kubernetes.io/projected/94f7dd38-b255-46cd-8b05-4c720857dd86-kube-api-access-4qpvq\") pod \"perses-operator-5bf474d74f-j5gtw\" (UID: \"94f7dd38-b255-46cd-8b05-4c720857dd86\") " pod="openshift-operators/perses-operator-5bf474d74f-j5gtw"
Feb 20 16:44:21 crc kubenswrapper[4697]: I0220 16:44:21.001573 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-j5gtw"
Feb 20 16:44:21 crc kubenswrapper[4697]: I0220 16:44:21.082484 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8wmf5"]
Feb 20 16:44:21 crc kubenswrapper[4697]: I0220 16:44:21.473462 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-j5gtw"]
Feb 20 16:44:21 crc kubenswrapper[4697]: I0220 16:44:21.494238 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-t4v6c"
Feb 20 16:44:21 crc kubenswrapper[4697]: I0220 16:44:21.494883 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5"
Feb 20 16:44:21 crc kubenswrapper[4697]: I0220 16:44:21.495468 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw"
Feb 20 16:44:21 crc kubenswrapper[4697]: I0220 16:44:21.520118 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72v4q" event={"ID":"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f","Type":"ContainerStarted","Data":"03216480f0cbf008beffc5f44fd741b6551ebf7246f8e25cc36a9023bccc4313"}
Feb 20 16:44:21 crc kubenswrapper[4697]: I0220 16:44:21.534883 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-j5gtw" event={"ID":"94f7dd38-b255-46cd-8b05-4c720857dd86","Type":"ContainerStarted","Data":"ea9b868300a7e8c6d29aedd6aa7b8cf12831af52797d8ab7f565bad9b1765729"}
Feb 20 16:44:21 crc kubenswrapper[4697]: I0220 16:44:21.554670 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-8wmf5" event={"ID":"12db7fd9-ce39-43cf-99b7-3a56791c0390","Type":"ContainerStarted","Data":"ad04be9c756020277e8eeb71ebd4c437401128e037da0ca07469a86ccc927df3"}
Feb 20 16:44:21 crc kubenswrapper[4697]: I0220 16:44:21.573653 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cqddv" event={"ID":"214e6680-a3c8-4286-9c91-d68893ba73be","Type":"ContainerStarted","Data":"821383087c502cf5227c9b724bc0c0c578f7c1e826d28cf02a7e1724a39828d1"}
Feb 20 16:44:21 crc kubenswrapper[4697]: I0220 16:44:21.846818 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw"]
Feb 20 16:44:22 crc kubenswrapper[4697]: I0220 16:44:22.142170 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5"]
Feb 20 16:44:22 crc kubenswrapper[4697]: W0220 16:44:22.180200 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod565bbe20_d8eb_4878_a048_4d78d8123f6d.slice/crio-deafef6452032e41b8f43390e79bae8d9d7edd4896167812de8ae7ac1f4e8c3b WatchSource:0}: Error finding container deafef6452032e41b8f43390e79bae8d9d7edd4896167812de8ae7ac1f4e8c3b: Status 404 returned error can't find the container with id deafef6452032e41b8f43390e79bae8d9d7edd4896167812de8ae7ac1f4e8c3b
Feb 20 16:44:22 crc kubenswrapper[4697]: I0220 16:44:22.580454 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw" event={"ID":"68c668ba-cbcc-4330-95e2-012c78108925","Type":"ContainerStarted","Data":"720d2a4efd25e860d5d5dbbf43e4a0ed6fc56905b6afedc3ce376d58d61c2f62"}
Feb 20 16:44:22 crc kubenswrapper[4697]: I0220 16:44:22.582289 4697 generic.go:334] "Generic (PLEG): container finished" podID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerID="03216480f0cbf008beffc5f44fd741b6551ebf7246f8e25cc36a9023bccc4313" exitCode=0
Feb 20 16:44:22 crc kubenswrapper[4697]: I0220 16:44:22.582355 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72v4q" event={"ID":"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f","Type":"ContainerDied","Data":"03216480f0cbf008beffc5f44fd741b6551ebf7246f8e25cc36a9023bccc4313"}
Feb 20 16:44:22 crc kubenswrapper[4697]: I0220 16:44:22.585282 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5" event={"ID":"565bbe20-d8eb-4878-a048-4d78d8123f6d","Type":"ContainerStarted","Data":"deafef6452032e41b8f43390e79bae8d9d7edd4896167812de8ae7ac1f4e8c3b"}
Feb 20 16:44:23 crc kubenswrapper[4697]: I0220 16:44:23.593739 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72v4q" event={"ID":"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f","Type":"ContainerStarted","Data":"c0de2030873ebd450d0b348d703f30bd7b15a042fac1ed72b1662d42351cdd3f"}
Feb 20 16:44:29 crc kubenswrapper[4697]: I0220 16:44:29.495378 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-72v4q"
Feb 20 16:44:29 crc kubenswrapper[4697]: I0220 16:44:29.496419 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-72v4q"
Feb 20 16:44:30 crc kubenswrapper[4697]: I0220 16:44:30.556327 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-72v4q" podUID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerName="registry-server" probeResult="failure" output=<
Feb 20 16:44:30 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s
Feb 20 16:44:30 crc kubenswrapper[4697]: >
Feb 20 16:44:31 crc kubenswrapper[4697]: I0220 16:44:31.962322 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-72v4q" podStartSLOduration=10.493054705 podStartE2EDuration="12.962302271s" podCreationTimestamp="2026-02-20 16:44:19 +0000 UTC" firstStartedPulling="2026-02-20 16:44:20.510924734 +0000 UTC m=+768.290970132" lastFinishedPulling="2026-02-20 16:44:22.98017229 +0000 UTC m=+770.760217698" observedRunningTime="2026-02-20 16:44:23.625767214 +0000 UTC m=+771.405812622" watchObservedRunningTime="2026-02-20 16:44:31.962302271 +0000 UTC m=+779.742347679"
Feb 20 16:44:31 crc kubenswrapper[4697]: I0220 16:44:31.962875 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-55ddq"]
Feb 20 16:44:31 crc kubenswrapper[4697]: I0220 16:44:31.963882 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:31 crc kubenswrapper[4697]: I0220 16:44:31.974515 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-55ddq"]
Feb 20 16:44:32 crc kubenswrapper[4697]: I0220 16:44:32.120499 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-catalog-content\") pod \"community-operators-55ddq\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:32 crc kubenswrapper[4697]: I0220 16:44:32.120542 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxrn7\" (UniqueName: \"kubernetes.io/projected/6272a571-8bee-4b3b-bb56-bfbba6121604-kube-api-access-cxrn7\") pod \"community-operators-55ddq\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:32 crc kubenswrapper[4697]: I0220 16:44:32.120588 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-utilities\") pod \"community-operators-55ddq\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:32 crc kubenswrapper[4697]: I0220 16:44:32.221961 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-catalog-content\") pod \"community-operators-55ddq\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:32 crc kubenswrapper[4697]: I0220 16:44:32.222016 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxrn7\" (UniqueName: \"kubernetes.io/projected/6272a571-8bee-4b3b-bb56-bfbba6121604-kube-api-access-cxrn7\") pod \"community-operators-55ddq\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:32 crc kubenswrapper[4697]: I0220 16:44:32.222072 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-utilities\") pod \"community-operators-55ddq\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:32 crc kubenswrapper[4697]: I0220 16:44:32.222491 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-catalog-content\") pod \"community-operators-55ddq\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:32 crc kubenswrapper[4697]: I0220 16:44:32.222606 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-utilities\") pod \"community-operators-55ddq\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:32 crc kubenswrapper[4697]: I0220 16:44:32.253161 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxrn7\" (UniqueName: \"kubernetes.io/projected/6272a571-8bee-4b3b-bb56-bfbba6121604-kube-api-access-cxrn7\") pod \"community-operators-55ddq\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:32 crc kubenswrapper[4697]: I0220 16:44:32.280617 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.344634 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-55ddq"]
Feb 20 16:44:34 crc kubenswrapper[4697]: W0220 16:44:34.357851 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6272a571_8bee_4b3b_bb56_bfbba6121604.slice/crio-4a5a1b55858525e94e7311d07a8af70eba50a62f510aede4516709e58222a36d WatchSource:0}: Error finding container 4a5a1b55858525e94e7311d07a8af70eba50a62f510aede4516709e58222a36d: Status 404 returned error can't find the container with id 4a5a1b55858525e94e7311d07a8af70eba50a62f510aede4516709e58222a36d
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.660405 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5" event={"ID":"565bbe20-d8eb-4878-a048-4d78d8123f6d","Type":"ContainerStarted","Data":"41ba3211a69f4ffd903220329d2a4737bfbe0b2945110f8305ee35a7a6406b3f"}
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.662559 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-j5gtw" event={"ID":"94f7dd38-b255-46cd-8b05-4c720857dd86","Type":"ContainerStarted","Data":"cd345469c6c97c22de29c4caf35c375240dc5ed725760d2c0ec4f3a756461be2"}
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.662609 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-j5gtw"
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.663834 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-8wmf5" event={"ID":"12db7fd9-ce39-43cf-99b7-3a56791c0390","Type":"ContainerStarted","Data":"58ff8a7460c456f5e302360a54fb5f553f9daa59b287915cd4980e987abb6175"}
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.664034 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-8wmf5"
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.666255 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw" event={"ID":"68c668ba-cbcc-4330-95e2-012c78108925","Type":"ContainerStarted","Data":"0f94e6334e79f44fd8c5b2076f654deef0163fd3f55da3afff235a4f077ead8a"}
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.667645 4697 generic.go:334] "Generic (PLEG): container finished" podID="6272a571-8bee-4b3b-bb56-bfbba6121604" containerID="79b8d83d16f940b22f68cf67d7b03585b57d87b061810d738d8d5b938d06a3b7" exitCode=0
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.667684 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55ddq" event={"ID":"6272a571-8bee-4b3b-bb56-bfbba6121604","Type":"ContainerDied","Data":"79b8d83d16f940b22f68cf67d7b03585b57d87b061810d738d8d5b938d06a3b7"}
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.667726 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55ddq" event={"ID":"6272a571-8bee-4b3b-bb56-bfbba6121604","Type":"ContainerStarted","Data":"4a5a1b55858525e94e7311d07a8af70eba50a62f510aede4516709e58222a36d"}
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.670404 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cqddv" event={"ID":"214e6680-a3c8-4286-9c91-d68893ba73be","Type":"ContainerStarted","Data":"935e5ce6bc850712508c5cf776f6cb1731d11c98bf13f567ef4be736e91a7693"}
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.679025 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-8wmf5"
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.684056 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5" podStartSLOduration=2.796386352 podStartE2EDuration="14.684042223s" podCreationTimestamp="2026-02-20 16:44:20 +0000 UTC" firstStartedPulling="2026-02-20 16:44:22.182809975 +0000 UTC m=+769.962855383" lastFinishedPulling="2026-02-20 16:44:34.070465846 +0000 UTC m=+781.850511254" observedRunningTime="2026-02-20 16:44:34.680192986 +0000 UTC m=+782.460238394" watchObservedRunningTime="2026-02-20 16:44:34.684042223 +0000 UTC m=+782.464087621"
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.713202 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8679bd8497-phkqw" podStartSLOduration=2.673176522 podStartE2EDuration="14.713184763s" podCreationTimestamp="2026-02-20 16:44:20 +0000 UTC" firstStartedPulling="2026-02-20 16:44:21.934718884 +0000 UTC m=+769.714764292" lastFinishedPulling="2026-02-20 16:44:33.974727125 +0000 UTC m=+781.754772533" observedRunningTime="2026-02-20 16:44:34.708350292 +0000 UTC m=+782.488395700" watchObservedRunningTime="2026-02-20 16:44:34.713184763 +0000 UTC m=+782.493230171"
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.726546 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-cqddv" podStartSLOduration=1.4779667810000001 podStartE2EDuration="14.726530228s" podCreationTimestamp="2026-02-20 16:44:20 +0000 UTC" firstStartedPulling="2026-02-20 16:44:20.722247879 +0000 UTC m=+768.502293287" lastFinishedPulling="2026-02-20 16:44:33.970811326 +0000 UTC m=+781.750856734" observedRunningTime="2026-02-20 16:44:34.723869261 +0000 UTC m=+782.503914669" watchObservedRunningTime="2026-02-20 16:44:34.726530228 +0000 UTC m=+782.506575636"
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.747894 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-8wmf5" podStartSLOduration=1.897824484 podStartE2EDuration="14.747877483s" podCreationTimestamp="2026-02-20 16:44:20 +0000 UTC" firstStartedPulling="2026-02-20 16:44:21.120119771 +0000 UTC m=+768.900165179" lastFinishedPulling="2026-02-20 16:44:33.97017277 +0000 UTC m=+781.750218178" observedRunningTime="2026-02-20 16:44:34.74615387 +0000 UTC m=+782.526199278" watchObservedRunningTime="2026-02-20 16:44:34.747877483 +0000 UTC m=+782.527922892"
Feb 20 16:44:34 crc kubenswrapper[4697]: I0220 16:44:34.796650 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-j5gtw" podStartSLOduration=2.310091792 podStartE2EDuration="14.796633666s" podCreationTimestamp="2026-02-20 16:44:20 +0000 UTC" firstStartedPulling="2026-02-20 16:44:21.483677797 +0000 UTC m=+769.263723205" lastFinishedPulling="2026-02-20 16:44:33.970219671 +0000 UTC m=+781.750265079" observedRunningTime="2026-02-20 16:44:34.793955509 +0000 UTC m=+782.574000927" watchObservedRunningTime="2026-02-20 16:44:34.796633666 +0000 UTC m=+782.576679074"
Feb 20 16:44:37 crc kubenswrapper[4697]: I0220 16:44:37.700090 4697 generic.go:334] "Generic (PLEG): container finished" podID="6272a571-8bee-4b3b-bb56-bfbba6121604" containerID="2d87c37fc8af37c59296d8c092711dec01f8ea4de68e751a05c2cef23a704f3b" exitCode=0
Feb 20 16:44:37 crc kubenswrapper[4697]: I0220 16:44:37.700413 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55ddq" event={"ID":"6272a571-8bee-4b3b-bb56-bfbba6121604","Type":"ContainerDied","Data":"2d87c37fc8af37c59296d8c092711dec01f8ea4de68e751a05c2cef23a704f3b"}
Feb 20 16:44:38 crc kubenswrapper[4697]: I0220 16:44:38.708540 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55ddq" event={"ID":"6272a571-8bee-4b3b-bb56-bfbba6121604","Type":"ContainerStarted","Data":"294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3"}
Feb 20 16:44:38 crc kubenswrapper[4697]: I0220 16:44:38.738372 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-55ddq" podStartSLOduration=4.103219424 podStartE2EDuration="7.738343124s" podCreationTimestamp="2026-02-20 16:44:31 +0000 UTC" firstStartedPulling="2026-02-20 16:44:34.668640176 +0000 UTC m=+782.448685584" lastFinishedPulling="2026-02-20 16:44:38.303763876 +0000 UTC m=+786.083809284" observedRunningTime="2026-02-20 16:44:38.735228386 +0000 UTC m=+786.515273794" watchObservedRunningTime="2026-02-20 16:44:38.738343124 +0000 UTC m=+786.518388532"
Feb 20 16:44:39 crc kubenswrapper[4697]: I0220 16:44:39.534830 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-72v4q"
Feb 20 16:44:39 crc kubenswrapper[4697]: I0220 16:44:39.575465 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-72v4q"
Feb 20 16:44:41 crc kubenswrapper[4697]: I0220 16:44:41.005356 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-j5gtw"
Feb 20 16:44:42 crc kubenswrapper[4697]: I0220 16:44:42.280921 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:42 crc kubenswrapper[4697]: I0220 16:44:42.281255 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:42 crc kubenswrapper[4697]: I0220 16:44:42.329517 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-55ddq"
Feb 20 16:44:43 crc kubenswrapper[4697]: I0220 16:44:43.153979 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-72v4q"]
Feb 20 16:44:43 crc kubenswrapper[4697]: I0220 16:44:43.154268 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-72v4q" podUID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerName="registry-server" containerID="cri-o://c0de2030873ebd450d0b348d703f30bd7b15a042fac1ed72b1662d42351cdd3f" gracePeriod=2
Feb 20 16:44:43 crc kubenswrapper[4697]: I0220 16:44:43.733218 4697 generic.go:334] "Generic (PLEG): container finished" podID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerID="c0de2030873ebd450d0b348d703f30bd7b15a042fac1ed72b1662d42351cdd3f" exitCode=0
Feb 20 16:44:43 crc kubenswrapper[4697]: I0220 16:44:43.733264 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72v4q" event={"ID":"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f","Type":"ContainerDied","Data":"c0de2030873ebd450d0b348d703f30bd7b15a042fac1ed72b1662d42351cdd3f"}
Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.092218 4697 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.177902 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-catalog-content\") pod \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.177949 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm2rf\" (UniqueName: \"kubernetes.io/projected/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-kube-api-access-cm2rf\") pod \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.178038 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-utilities\") pod \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\" (UID: \"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f\") " Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.178933 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-utilities" (OuterVolumeSpecName: "utilities") pod "8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" (UID: "8fa20515-fb45-4f03-a6b7-bcbde3b6e40f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.179118 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.183620 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-kube-api-access-cm2rf" (OuterVolumeSpecName: "kube-api-access-cm2rf") pod "8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" (UID: "8fa20515-fb45-4f03-a6b7-bcbde3b6e40f"). InnerVolumeSpecName "kube-api-access-cm2rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.279777 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm2rf\" (UniqueName: \"kubernetes.io/projected/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-kube-api-access-cm2rf\") on node \"crc\" DevicePath \"\"" Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.308922 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" (UID: "8fa20515-fb45-4f03-a6b7-bcbde3b6e40f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.380448 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.740198 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72v4q" event={"ID":"8fa20515-fb45-4f03-a6b7-bcbde3b6e40f","Type":"ContainerDied","Data":"adc38b2830b3e96d049dec65fa9e99a6563ccb408ec69c613f3cc6a5ac82cb6b"} Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.740253 4697 scope.go:117] "RemoveContainer" containerID="c0de2030873ebd450d0b348d703f30bd7b15a042fac1ed72b1662d42351cdd3f" Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.740277 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-72v4q" Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.759076 4697 scope.go:117] "RemoveContainer" containerID="03216480f0cbf008beffc5f44fd741b6551ebf7246f8e25cc36a9023bccc4313" Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.766909 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-72v4q"] Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.771102 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-72v4q"] Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.794557 4697 scope.go:117] "RemoveContainer" containerID="585ce33744decb9beda974b3f1b5b54e8b12b09782ca8f142b090a529db44959" Feb 20 16:44:44 crc kubenswrapper[4697]: I0220 16:44:44.884195 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" path="/var/lib/kubelet/pods/8fa20515-fb45-4f03-a6b7-bcbde3b6e40f/volumes" Feb 20 16:44:52 crc 
kubenswrapper[4697]: I0220 16:44:52.333151 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-55ddq" Feb 20 16:44:54 crc kubenswrapper[4697]: I0220 16:44:54.753077 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-55ddq"] Feb 20 16:44:54 crc kubenswrapper[4697]: I0220 16:44:54.753561 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-55ddq" podUID="6272a571-8bee-4b3b-bb56-bfbba6121604" containerName="registry-server" containerID="cri-o://294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3" gracePeriod=2 Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.125658 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-55ddq" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.235023 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-catalog-content\") pod \"6272a571-8bee-4b3b-bb56-bfbba6121604\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.235077 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxrn7\" (UniqueName: \"kubernetes.io/projected/6272a571-8bee-4b3b-bb56-bfbba6121604-kube-api-access-cxrn7\") pod \"6272a571-8bee-4b3b-bb56-bfbba6121604\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.235120 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-utilities\") pod \"6272a571-8bee-4b3b-bb56-bfbba6121604\" (UID: \"6272a571-8bee-4b3b-bb56-bfbba6121604\") " Feb 20 16:44:55 
crc kubenswrapper[4697]: I0220 16:44:55.236093 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-utilities" (OuterVolumeSpecName: "utilities") pod "6272a571-8bee-4b3b-bb56-bfbba6121604" (UID: "6272a571-8bee-4b3b-bb56-bfbba6121604"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.265314 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6272a571-8bee-4b3b-bb56-bfbba6121604-kube-api-access-cxrn7" (OuterVolumeSpecName: "kube-api-access-cxrn7") pod "6272a571-8bee-4b3b-bb56-bfbba6121604" (UID: "6272a571-8bee-4b3b-bb56-bfbba6121604"). InnerVolumeSpecName "kube-api-access-cxrn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.292176 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6272a571-8bee-4b3b-bb56-bfbba6121604" (UID: "6272a571-8bee-4b3b-bb56-bfbba6121604"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.336912 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.336955 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxrn7\" (UniqueName: \"kubernetes.io/projected/6272a571-8bee-4b3b-bb56-bfbba6121604-kube-api-access-cxrn7\") on node \"crc\" DevicePath \"\"" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.336971 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6272a571-8bee-4b3b-bb56-bfbba6121604-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.825580 4697 generic.go:334] "Generic (PLEG): container finished" podID="6272a571-8bee-4b3b-bb56-bfbba6121604" containerID="294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3" exitCode=0 Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.825635 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55ddq" event={"ID":"6272a571-8bee-4b3b-bb56-bfbba6121604","Type":"ContainerDied","Data":"294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3"} Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.825659 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-55ddq" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.825673 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55ddq" event={"ID":"6272a571-8bee-4b3b-bb56-bfbba6121604","Type":"ContainerDied","Data":"4a5a1b55858525e94e7311d07a8af70eba50a62f510aede4516709e58222a36d"} Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.825696 4697 scope.go:117] "RemoveContainer" containerID="294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.843009 4697 scope.go:117] "RemoveContainer" containerID="2d87c37fc8af37c59296d8c092711dec01f8ea4de68e751a05c2cef23a704f3b" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.867700 4697 scope.go:117] "RemoveContainer" containerID="79b8d83d16f940b22f68cf67d7b03585b57d87b061810d738d8d5b938d06a3b7" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.873481 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-55ddq"] Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.881195 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-55ddq"] Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.885567 4697 scope.go:117] "RemoveContainer" containerID="294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3" Feb 20 16:44:55 crc kubenswrapper[4697]: E0220 16:44:55.886096 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3\": container with ID starting with 294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3 not found: ID does not exist" containerID="294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.886141 4697 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3"} err="failed to get container status \"294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3\": rpc error: code = NotFound desc = could not find container \"294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3\": container with ID starting with 294d7b3af467893e927f53f3b7e8a642510ae52ed111b19a25d71582139b3ad3 not found: ID does not exist" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.886168 4697 scope.go:117] "RemoveContainer" containerID="2d87c37fc8af37c59296d8c092711dec01f8ea4de68e751a05c2cef23a704f3b" Feb 20 16:44:55 crc kubenswrapper[4697]: E0220 16:44:55.886543 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d87c37fc8af37c59296d8c092711dec01f8ea4de68e751a05c2cef23a704f3b\": container with ID starting with 2d87c37fc8af37c59296d8c092711dec01f8ea4de68e751a05c2cef23a704f3b not found: ID does not exist" containerID="2d87c37fc8af37c59296d8c092711dec01f8ea4de68e751a05c2cef23a704f3b" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.886585 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d87c37fc8af37c59296d8c092711dec01f8ea4de68e751a05c2cef23a704f3b"} err="failed to get container status \"2d87c37fc8af37c59296d8c092711dec01f8ea4de68e751a05c2cef23a704f3b\": rpc error: code = NotFound desc = could not find container \"2d87c37fc8af37c59296d8c092711dec01f8ea4de68e751a05c2cef23a704f3b\": container with ID starting with 2d87c37fc8af37c59296d8c092711dec01f8ea4de68e751a05c2cef23a704f3b not found: ID does not exist" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.886612 4697 scope.go:117] "RemoveContainer" containerID="79b8d83d16f940b22f68cf67d7b03585b57d87b061810d738d8d5b938d06a3b7" Feb 20 16:44:55 crc kubenswrapper[4697]: E0220 
16:44:55.887044 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b8d83d16f940b22f68cf67d7b03585b57d87b061810d738d8d5b938d06a3b7\": container with ID starting with 79b8d83d16f940b22f68cf67d7b03585b57d87b061810d738d8d5b938d06a3b7 not found: ID does not exist" containerID="79b8d83d16f940b22f68cf67d7b03585b57d87b061810d738d8d5b938d06a3b7" Feb 20 16:44:55 crc kubenswrapper[4697]: I0220 16:44:55.887115 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b8d83d16f940b22f68cf67d7b03585b57d87b061810d738d8d5b938d06a3b7"} err="failed to get container status \"79b8d83d16f940b22f68cf67d7b03585b57d87b061810d738d8d5b938d06a3b7\": rpc error: code = NotFound desc = could not find container \"79b8d83d16f940b22f68cf67d7b03585b57d87b061810d738d8d5b938d06a3b7\": container with ID starting with 79b8d83d16f940b22f68cf67d7b03585b57d87b061810d738d8d5b938d06a3b7 not found: ID does not exist" Feb 20 16:44:56 crc kubenswrapper[4697]: I0220 16:44:56.885783 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6272a571-8bee-4b3b-bb56-bfbba6121604" path="/var/lib/kubelet/pods/6272a571-8bee-4b3b-bb56-bfbba6121604/volumes" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.621926 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl"] Feb 20 16:44:57 crc kubenswrapper[4697]: E0220 16:44:57.622130 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerName="registry-server" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.622141 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerName="registry-server" Feb 20 16:44:57 crc kubenswrapper[4697]: E0220 16:44:57.622151 4697 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerName="extract-utilities" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.622157 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerName="extract-utilities" Feb 20 16:44:57 crc kubenswrapper[4697]: E0220 16:44:57.622166 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6272a571-8bee-4b3b-bb56-bfbba6121604" containerName="extract-content" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.622172 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="6272a571-8bee-4b3b-bb56-bfbba6121604" containerName="extract-content" Feb 20 16:44:57 crc kubenswrapper[4697]: E0220 16:44:57.622181 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerName="extract-content" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.622187 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerName="extract-content" Feb 20 16:44:57 crc kubenswrapper[4697]: E0220 16:44:57.622195 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6272a571-8bee-4b3b-bb56-bfbba6121604" containerName="registry-server" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.622201 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="6272a571-8bee-4b3b-bb56-bfbba6121604" containerName="registry-server" Feb 20 16:44:57 crc kubenswrapper[4697]: E0220 16:44:57.622211 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6272a571-8bee-4b3b-bb56-bfbba6121604" containerName="extract-utilities" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.622216 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="6272a571-8bee-4b3b-bb56-bfbba6121604" containerName="extract-utilities" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.622308 4697 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8fa20515-fb45-4f03-a6b7-bcbde3b6e40f" containerName="registry-server" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.622320 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="6272a571-8bee-4b3b-bb56-bfbba6121604" containerName="registry-server" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.622998 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.625026 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.631901 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl"] Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.767527 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.767728 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.767809 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-lrpkh\" (UniqueName: \"kubernetes.io/projected/2547a2de-95cc-4068-9dc8-6ac185ccd3af-kube-api-access-lrpkh\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.869765 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.869936 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.870006 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrpkh\" (UniqueName: \"kubernetes.io/projected/2547a2de-95cc-4068-9dc8-6ac185ccd3af-kube-api-access-lrpkh\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.870236 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-bundle\") pod 
\"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.870397 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.896308 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrpkh\" (UniqueName: \"kubernetes.io/projected/2547a2de-95cc-4068-9dc8-6ac185ccd3af-kube-api-access-lrpkh\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:44:57 crc kubenswrapper[4697]: I0220 16:44:57.936639 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:44:58 crc kubenswrapper[4697]: I0220 16:44:58.152621 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl"] Feb 20 16:44:58 crc kubenswrapper[4697]: I0220 16:44:58.844939 4697 generic.go:334] "Generic (PLEG): container finished" podID="2547a2de-95cc-4068-9dc8-6ac185ccd3af" containerID="7edbc08227736bafcdc1336796e4a9bd91e91a7fd379d63e39cd1244a741bcf2" exitCode=0 Feb 20 16:44:58 crc kubenswrapper[4697]: I0220 16:44:58.845051 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" event={"ID":"2547a2de-95cc-4068-9dc8-6ac185ccd3af","Type":"ContainerDied","Data":"7edbc08227736bafcdc1336796e4a9bd91e91a7fd379d63e39cd1244a741bcf2"} Feb 20 16:44:58 crc kubenswrapper[4697]: I0220 16:44:58.845372 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" event={"ID":"2547a2de-95cc-4068-9dc8-6ac185ccd3af","Type":"ContainerStarted","Data":"fccd240e23f7965577b491c31bf1887f31bf92770e05e38976e2c2bb1c08d9e0"} Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.152843 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh"] Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.155135 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.172524 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.173360 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.180533 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh"] Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.302193 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3c8641f-25ce-4499-9cfc-ba7c464f2097-secret-volume\") pod \"collect-profiles-29526765-qhthh\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.302232 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgs7g\" (UniqueName: \"kubernetes.io/projected/d3c8641f-25ce-4499-9cfc-ba7c464f2097-kube-api-access-tgs7g\") pod \"collect-profiles-29526765-qhthh\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.302418 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3c8641f-25ce-4499-9cfc-ba7c464f2097-config-volume\") pod \"collect-profiles-29526765-qhthh\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.403607 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3c8641f-25ce-4499-9cfc-ba7c464f2097-config-volume\") pod \"collect-profiles-29526765-qhthh\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.403671 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3c8641f-25ce-4499-9cfc-ba7c464f2097-secret-volume\") pod \"collect-profiles-29526765-qhthh\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.403690 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgs7g\" (UniqueName: \"kubernetes.io/projected/d3c8641f-25ce-4499-9cfc-ba7c464f2097-kube-api-access-tgs7g\") pod \"collect-profiles-29526765-qhthh\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.404460 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3c8641f-25ce-4499-9cfc-ba7c464f2097-config-volume\") pod \"collect-profiles-29526765-qhthh\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.409052 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d3c8641f-25ce-4499-9cfc-ba7c464f2097-secret-volume\") pod \"collect-profiles-29526765-qhthh\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.427460 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgs7g\" (UniqueName: \"kubernetes.io/projected/d3c8641f-25ce-4499-9cfc-ba7c464f2097-kube-api-access-tgs7g\") pod \"collect-profiles-29526765-qhthh\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.497987 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.859336 4697 generic.go:334] "Generic (PLEG): container finished" podID="2547a2de-95cc-4068-9dc8-6ac185ccd3af" containerID="238b9709bca7d0856a0c40234f8e23f15d237a9a366bfe918767e8f6a9e21d71" exitCode=0 Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.859556 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" event={"ID":"2547a2de-95cc-4068-9dc8-6ac185ccd3af","Type":"ContainerDied","Data":"238b9709bca7d0856a0c40234f8e23f15d237a9a366bfe918767e8f6a9e21d71"} Feb 20 16:45:00 crc kubenswrapper[4697]: I0220 16:45:00.884964 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh"] Feb 20 16:45:00 crc kubenswrapper[4697]: W0220 16:45:00.889228 4697 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c8641f_25ce_4499_9cfc_ba7c464f2097.slice/crio-df28ad677264e3150aa8c36c05d175e2020f31d3e11fa3abb1b4f2ada9e19c81 WatchSource:0}: Error finding container df28ad677264e3150aa8c36c05d175e2020f31d3e11fa3abb1b4f2ada9e19c81: Status 404 returned error can't find the container with id df28ad677264e3150aa8c36c05d175e2020f31d3e11fa3abb1b4f2ada9e19c81 Feb 20 16:45:01 crc kubenswrapper[4697]: I0220 16:45:01.888977 4697 generic.go:334] "Generic (PLEG): container finished" podID="2547a2de-95cc-4068-9dc8-6ac185ccd3af" containerID="65bb9c6f1625084728db7e72c40c52f9f7081e43858dacb9347b179e39a3dc59" exitCode=0 Feb 20 16:45:01 crc kubenswrapper[4697]: I0220 16:45:01.889044 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" event={"ID":"2547a2de-95cc-4068-9dc8-6ac185ccd3af","Type":"ContainerDied","Data":"65bb9c6f1625084728db7e72c40c52f9f7081e43858dacb9347b179e39a3dc59"} Feb 20 16:45:01 crc kubenswrapper[4697]: I0220 16:45:01.890814 4697 generic.go:334] "Generic (PLEG): container finished" podID="d3c8641f-25ce-4499-9cfc-ba7c464f2097" containerID="d821b6203adb7f9f04719048cd1fc9e494b0bab5605a9740d71e93eace5e0b25" exitCode=0 Feb 20 16:45:01 crc kubenswrapper[4697]: I0220 16:45:01.890861 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" event={"ID":"d3c8641f-25ce-4499-9cfc-ba7c464f2097","Type":"ContainerDied","Data":"d821b6203adb7f9f04719048cd1fc9e494b0bab5605a9740d71e93eace5e0b25"} Feb 20 16:45:01 crc kubenswrapper[4697]: I0220 16:45:01.890884 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" event={"ID":"d3c8641f-25ce-4499-9cfc-ba7c464f2097","Type":"ContainerStarted","Data":"df28ad677264e3150aa8c36c05d175e2020f31d3e11fa3abb1b4f2ada9e19c81"} Feb 20 16:45:03 crc 
kubenswrapper[4697]: I0220 16:45:03.148959 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.154072 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.338205 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrpkh\" (UniqueName: \"kubernetes.io/projected/2547a2de-95cc-4068-9dc8-6ac185ccd3af-kube-api-access-lrpkh\") pod \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.338265 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgs7g\" (UniqueName: \"kubernetes.io/projected/d3c8641f-25ce-4499-9cfc-ba7c464f2097-kube-api-access-tgs7g\") pod \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.338315 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-util\") pod \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.338353 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3c8641f-25ce-4499-9cfc-ba7c464f2097-config-volume\") pod \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.338407 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3c8641f-25ce-4499-9cfc-ba7c464f2097-secret-volume\") pod \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\" (UID: \"d3c8641f-25ce-4499-9cfc-ba7c464f2097\") " Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.338458 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-bundle\") pod \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\" (UID: \"2547a2de-95cc-4068-9dc8-6ac185ccd3af\") " Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.339206 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3c8641f-25ce-4499-9cfc-ba7c464f2097-config-volume" (OuterVolumeSpecName: "config-volume") pod "d3c8641f-25ce-4499-9cfc-ba7c464f2097" (UID: "d3c8641f-25ce-4499-9cfc-ba7c464f2097"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.339228 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-bundle" (OuterVolumeSpecName: "bundle") pod "2547a2de-95cc-4068-9dc8-6ac185ccd3af" (UID: "2547a2de-95cc-4068-9dc8-6ac185ccd3af"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.343167 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2547a2de-95cc-4068-9dc8-6ac185ccd3af-kube-api-access-lrpkh" (OuterVolumeSpecName: "kube-api-access-lrpkh") pod "2547a2de-95cc-4068-9dc8-6ac185ccd3af" (UID: "2547a2de-95cc-4068-9dc8-6ac185ccd3af"). InnerVolumeSpecName "kube-api-access-lrpkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.343157 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3c8641f-25ce-4499-9cfc-ba7c464f2097-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d3c8641f-25ce-4499-9cfc-ba7c464f2097" (UID: "d3c8641f-25ce-4499-9cfc-ba7c464f2097"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.343306 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3c8641f-25ce-4499-9cfc-ba7c464f2097-kube-api-access-tgs7g" (OuterVolumeSpecName: "kube-api-access-tgs7g") pod "d3c8641f-25ce-4499-9cfc-ba7c464f2097" (UID: "d3c8641f-25ce-4499-9cfc-ba7c464f2097"). InnerVolumeSpecName "kube-api-access-tgs7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.353905 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-util" (OuterVolumeSpecName: "util") pod "2547a2de-95cc-4068-9dc8-6ac185ccd3af" (UID: "2547a2de-95cc-4068-9dc8-6ac185ccd3af"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.439300 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrpkh\" (UniqueName: \"kubernetes.io/projected/2547a2de-95cc-4068-9dc8-6ac185ccd3af-kube-api-access-lrpkh\") on node \"crc\" DevicePath \"\"" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.439331 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgs7g\" (UniqueName: \"kubernetes.io/projected/d3c8641f-25ce-4499-9cfc-ba7c464f2097-kube-api-access-tgs7g\") on node \"crc\" DevicePath \"\"" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.439340 4697 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-util\") on node \"crc\" DevicePath \"\"" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.439351 4697 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3c8641f-25ce-4499-9cfc-ba7c464f2097-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.439359 4697 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3c8641f-25ce-4499-9cfc-ba7c464f2097-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.439367 4697 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2547a2de-95cc-4068-9dc8-6ac185ccd3af-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.904715 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" 
event={"ID":"2547a2de-95cc-4068-9dc8-6ac185ccd3af","Type":"ContainerDied","Data":"fccd240e23f7965577b491c31bf1887f31bf92770e05e38976e2c2bb1c08d9e0"} Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.904752 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fccd240e23f7965577b491c31bf1887f31bf92770e05e38976e2c2bb1c08d9e0" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.904751 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.906196 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" event={"ID":"d3c8641f-25ce-4499-9cfc-ba7c464f2097","Type":"ContainerDied","Data":"df28ad677264e3150aa8c36c05d175e2020f31d3e11fa3abb1b4f2ada9e19c81"} Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.906287 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df28ad677264e3150aa8c36c05d175e2020f31d3e11fa3abb1b4f2ada9e19c81" Feb 20 16:45:03 crc kubenswrapper[4697]: I0220 16:45:03.906250 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.398076 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-ljdjd"] Feb 20 16:45:08 crc kubenswrapper[4697]: E0220 16:45:08.398815 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c8641f-25ce-4499-9cfc-ba7c464f2097" containerName="collect-profiles" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.398827 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c8641f-25ce-4499-9cfc-ba7c464f2097" containerName="collect-profiles" Feb 20 16:45:08 crc kubenswrapper[4697]: E0220 16:45:08.398850 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2547a2de-95cc-4068-9dc8-6ac185ccd3af" containerName="util" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.398856 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="2547a2de-95cc-4068-9dc8-6ac185ccd3af" containerName="util" Feb 20 16:45:08 crc kubenswrapper[4697]: E0220 16:45:08.398865 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2547a2de-95cc-4068-9dc8-6ac185ccd3af" containerName="pull" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.398872 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="2547a2de-95cc-4068-9dc8-6ac185ccd3af" containerName="pull" Feb 20 16:45:08 crc kubenswrapper[4697]: E0220 16:45:08.398880 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2547a2de-95cc-4068-9dc8-6ac185ccd3af" containerName="extract" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.398885 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="2547a2de-95cc-4068-9dc8-6ac185ccd3af" containerName="extract" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.398977 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="2547a2de-95cc-4068-9dc8-6ac185ccd3af" containerName="extract" 
Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.398993 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c8641f-25ce-4499-9cfc-ba7c464f2097" containerName="collect-profiles" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.399368 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-ljdjd" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.401588 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.401724 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-9nkmb" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.401728 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.410595 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-ljdjd"] Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.498557 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnwqj\" (UniqueName: \"kubernetes.io/projected/278cea78-f0e4-4785-bc49-aa335706ccac-kube-api-access-vnwqj\") pod \"nmstate-operator-694c9596b7-ljdjd\" (UID: \"278cea78-f0e4-4785-bc49-aa335706ccac\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-ljdjd" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.600255 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnwqj\" (UniqueName: \"kubernetes.io/projected/278cea78-f0e4-4785-bc49-aa335706ccac-kube-api-access-vnwqj\") pod \"nmstate-operator-694c9596b7-ljdjd\" (UID: \"278cea78-f0e4-4785-bc49-aa335706ccac\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-ljdjd" Feb 20 16:45:08 crc 
kubenswrapper[4697]: I0220 16:45:08.625348 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnwqj\" (UniqueName: \"kubernetes.io/projected/278cea78-f0e4-4785-bc49-aa335706ccac-kube-api-access-vnwqj\") pod \"nmstate-operator-694c9596b7-ljdjd\" (UID: \"278cea78-f0e4-4785-bc49-aa335706ccac\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-ljdjd" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.723254 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-ljdjd" Feb 20 16:45:08 crc kubenswrapper[4697]: I0220 16:45:08.977829 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-ljdjd"] Feb 20 16:45:08 crc kubenswrapper[4697]: W0220 16:45:08.991731 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod278cea78_f0e4_4785_bc49_aa335706ccac.slice/crio-cf408a2245b155b5d9f02afdd1c709493660938dfc5c96b56f378d099ab059b7 WatchSource:0}: Error finding container cf408a2245b155b5d9f02afdd1c709493660938dfc5c96b56f378d099ab059b7: Status 404 returned error can't find the container with id cf408a2245b155b5d9f02afdd1c709493660938dfc5c96b56f378d099ab059b7 Feb 20 16:45:09 crc kubenswrapper[4697]: I0220 16:45:09.943369 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-ljdjd" event={"ID":"278cea78-f0e4-4785-bc49-aa335706ccac","Type":"ContainerStarted","Data":"cf408a2245b155b5d9f02afdd1c709493660938dfc5c96b56f378d099ab059b7"} Feb 20 16:45:11 crc kubenswrapper[4697]: I0220 16:45:11.961687 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-ljdjd" event={"ID":"278cea78-f0e4-4785-bc49-aa335706ccac","Type":"ContainerStarted","Data":"b14e377889985e64b55bace5a542080b85e9442c943c5c49f8459f0de5bd3b5a"} Feb 20 16:45:11 crc 
kubenswrapper[4697]: I0220 16:45:11.984291 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-ljdjd" podStartSLOduration=1.989958705 podStartE2EDuration="3.984251827s" podCreationTimestamp="2026-02-20 16:45:08 +0000 UTC" firstStartedPulling="2026-02-20 16:45:08.993371494 +0000 UTC m=+816.773416902" lastFinishedPulling="2026-02-20 16:45:10.987664616 +0000 UTC m=+818.767710024" observedRunningTime="2026-02-20 16:45:11.981649042 +0000 UTC m=+819.761694460" watchObservedRunningTime="2026-02-20 16:45:11.984251827 +0000 UTC m=+819.764297245" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.210340 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-vwprb"] Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.211950 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-vwprb" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.215049 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc"] Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.215415 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-8m4b9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.216011 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.223716 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.244574 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc"] Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.254792 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mmnb6"] Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.256027 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.263040 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-vwprb"] Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.333960 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9"] Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.334845 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.338456 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.338532 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.338689 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-vcfvk" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.348675 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9"] Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.399423 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nshvc\" (UniqueName: \"kubernetes.io/projected/567d32fd-31d2-4822-b487-ec35c250663d-kube-api-access-nshvc\") pod \"nmstate-webhook-866bcb46dc-xmzrc\" (UID: \"567d32fd-31d2-4822-b487-ec35c250663d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.399502 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj9vj\" (UniqueName: \"kubernetes.io/projected/7b12f1ce-9e0b-458b-990a-e38c2f2139c5-kube-api-access-dj9vj\") pod \"nmstate-metrics-58c85c668d-vwprb\" (UID: \"7b12f1ce-9e0b-458b-990a-e38c2f2139c5\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-vwprb" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.399531 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/567d32fd-31d2-4822-b487-ec35c250663d-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-xmzrc\" (UID: 
\"567d32fd-31d2-4822-b487-ec35c250663d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.399550 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89578f54-097d-4b2b-9809-103034a3a114-ovs-socket\") pod \"nmstate-handler-mmnb6\" (UID: \"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.399566 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pksq4\" (UniqueName: \"kubernetes.io/projected/89578f54-097d-4b2b-9809-103034a3a114-kube-api-access-pksq4\") pod \"nmstate-handler-mmnb6\" (UID: \"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.399590 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/89578f54-097d-4b2b-9809-103034a3a114-dbus-socket\") pod \"nmstate-handler-mmnb6\" (UID: \"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.399609 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/89578f54-097d-4b2b-9809-103034a3a114-nmstate-lock\") pod \"nmstate-handler-mmnb6\" (UID: \"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.500704 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/89578f54-097d-4b2b-9809-103034a3a114-nmstate-lock\") pod \"nmstate-handler-mmnb6\" (UID: 
\"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.500753 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d88c9050-548c-4305-b532-085c83436b3e-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-9x7l9\" (UID: \"d88c9050-548c-4305-b532-085c83436b3e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.500791 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nshvc\" (UniqueName: \"kubernetes.io/projected/567d32fd-31d2-4822-b487-ec35c250663d-kube-api-access-nshvc\") pod \"nmstate-webhook-866bcb46dc-xmzrc\" (UID: \"567d32fd-31d2-4822-b487-ec35c250663d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.500826 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/89578f54-097d-4b2b-9809-103034a3a114-nmstate-lock\") pod \"nmstate-handler-mmnb6\" (UID: \"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.500834 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d88c9050-548c-4305-b532-085c83436b3e-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-9x7l9\" (UID: \"d88c9050-548c-4305-b532-085c83436b3e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.500897 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8jk4\" (UniqueName: 
\"kubernetes.io/projected/d88c9050-548c-4305-b532-085c83436b3e-kube-api-access-r8jk4\") pod \"nmstate-console-plugin-5c78fc5d65-9x7l9\" (UID: \"d88c9050-548c-4305-b532-085c83436b3e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.500932 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj9vj\" (UniqueName: \"kubernetes.io/projected/7b12f1ce-9e0b-458b-990a-e38c2f2139c5-kube-api-access-dj9vj\") pod \"nmstate-metrics-58c85c668d-vwprb\" (UID: \"7b12f1ce-9e0b-458b-990a-e38c2f2139c5\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-vwprb" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.500979 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/567d32fd-31d2-4822-b487-ec35c250663d-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-xmzrc\" (UID: \"567d32fd-31d2-4822-b487-ec35c250663d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.500995 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89578f54-097d-4b2b-9809-103034a3a114-ovs-socket\") pod \"nmstate-handler-mmnb6\" (UID: \"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.501009 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pksq4\" (UniqueName: \"kubernetes.io/projected/89578f54-097d-4b2b-9809-103034a3a114-kube-api-access-pksq4\") pod \"nmstate-handler-mmnb6\" (UID: \"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.501227 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" 
(UniqueName: \"kubernetes.io/host-path/89578f54-097d-4b2b-9809-103034a3a114-dbus-socket\") pod \"nmstate-handler-mmnb6\" (UID: \"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.501106 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89578f54-097d-4b2b-9809-103034a3a114-ovs-socket\") pod \"nmstate-handler-mmnb6\" (UID: \"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: E0220 16:45:24.501086 4697 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 20 16:45:24 crc kubenswrapper[4697]: E0220 16:45:24.501359 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567d32fd-31d2-4822-b487-ec35c250663d-tls-key-pair podName:567d32fd-31d2-4822-b487-ec35c250663d nodeName:}" failed. No retries permitted until 2026-02-20 16:45:25.001345307 +0000 UTC m=+832.781390715 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/567d32fd-31d2-4822-b487-ec35c250663d-tls-key-pair") pod "nmstate-webhook-866bcb46dc-xmzrc" (UID: "567d32fd-31d2-4822-b487-ec35c250663d") : secret "openshift-nmstate-webhook" not found Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.501741 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/89578f54-097d-4b2b-9809-103034a3a114-dbus-socket\") pod \"nmstate-handler-mmnb6\" (UID: \"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.524504 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj9vj\" (UniqueName: \"kubernetes.io/projected/7b12f1ce-9e0b-458b-990a-e38c2f2139c5-kube-api-access-dj9vj\") pod \"nmstate-metrics-58c85c668d-vwprb\" (UID: \"7b12f1ce-9e0b-458b-990a-e38c2f2139c5\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-vwprb" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.525214 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nshvc\" (UniqueName: \"kubernetes.io/projected/567d32fd-31d2-4822-b487-ec35c250663d-kube-api-access-nshvc\") pod \"nmstate-webhook-866bcb46dc-xmzrc\" (UID: \"567d32fd-31d2-4822-b487-ec35c250663d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.525403 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pksq4\" (UniqueName: \"kubernetes.io/projected/89578f54-097d-4b2b-9809-103034a3a114-kube-api-access-pksq4\") pod \"nmstate-handler-mmnb6\" (UID: \"89578f54-097d-4b2b-9809-103034a3a114\") " pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.534098 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-vwprb" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.539250 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c865cc74f-r5qbl"] Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.540045 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.551765 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c865cc74f-r5qbl"] Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.570335 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.601866 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d88c9050-548c-4305-b532-085c83436b3e-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-9x7l9\" (UID: \"d88c9050-548c-4305-b532-085c83436b3e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.601909 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8jk4\" (UniqueName: \"kubernetes.io/projected/d88c9050-548c-4305-b532-085c83436b3e-kube-api-access-r8jk4\") pod \"nmstate-console-plugin-5c78fc5d65-9x7l9\" (UID: \"d88c9050-548c-4305-b532-085c83436b3e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.601970 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d88c9050-548c-4305-b532-085c83436b3e-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-9x7l9\" (UID: \"d88c9050-548c-4305-b532-085c83436b3e\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.602742 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d88c9050-548c-4305-b532-085c83436b3e-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-9x7l9\" (UID: \"d88c9050-548c-4305-b532-085c83436b3e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.609584 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d88c9050-548c-4305-b532-085c83436b3e-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-9x7l9\" (UID: \"d88c9050-548c-4305-b532-085c83436b3e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.626167 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8jk4\" (UniqueName: \"kubernetes.io/projected/d88c9050-548c-4305-b532-085c83436b3e-kube-api-access-r8jk4\") pod \"nmstate-console-plugin-5c78fc5d65-9x7l9\" (UID: \"d88c9050-548c-4305-b532-085c83436b3e\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.647522 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.703551 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-console-serving-cert\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.703607 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-trusted-ca-bundle\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.703645 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pd5r\" (UniqueName: \"kubernetes.io/projected/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-kube-api-access-7pd5r\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.703701 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-oauth-serving-cert\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.703808 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-console-config\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.703864 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-service-ca\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.703893 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-console-oauth-config\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.805041 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-service-ca\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.805076 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-console-oauth-config\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.805114 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-console-serving-cert\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.805134 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-trusted-ca-bundle\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.805171 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pd5r\" (UniqueName: \"kubernetes.io/projected/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-kube-api-access-7pd5r\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.805204 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-oauth-serving-cert\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.805236 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-console-config\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.806309 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-console-config\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.806319 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-oauth-serving-cert\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.806450 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-trusted-ca-bundle\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.806711 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-service-ca\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.815105 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-console-serving-cert\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.815106 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-console-oauth-config\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.822820 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pd5r\" (UniqueName: \"kubernetes.io/projected/0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8-kube-api-access-7pd5r\") pod \"console-7c865cc74f-r5qbl\" (UID: \"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8\") " pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.939798 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-vwprb"] Feb 20 16:45:24 crc kubenswrapper[4697]: I0220 16:45:24.947208 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:24 crc kubenswrapper[4697]: W0220 16:45:24.955874 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b12f1ce_9e0b_458b_990a_e38c2f2139c5.slice/crio-378a2fe71d302c350aebc01283207548e3e00ff1c4ad2dc696ba2c8d9f07c673 WatchSource:0}: Error finding container 378a2fe71d302c350aebc01283207548e3e00ff1c4ad2dc696ba2c8d9f07c673: Status 404 returned error can't find the container with id 378a2fe71d302c350aebc01283207548e3e00ff1c4ad2dc696ba2c8d9f07c673 Feb 20 16:45:25 crc kubenswrapper[4697]: I0220 16:45:25.007754 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/567d32fd-31d2-4822-b487-ec35c250663d-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-xmzrc\" (UID: \"567d32fd-31d2-4822-b487-ec35c250663d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" Feb 20 16:45:25 crc kubenswrapper[4697]: I0220 16:45:25.011262 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/567d32fd-31d2-4822-b487-ec35c250663d-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-xmzrc\" (UID: \"567d32fd-31d2-4822-b487-ec35c250663d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" Feb 20 16:45:25 crc kubenswrapper[4697]: I0220 16:45:25.033897 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9"] Feb 20 16:45:25 crc kubenswrapper[4697]: I0220 16:45:25.038991 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mmnb6" event={"ID":"89578f54-097d-4b2b-9809-103034a3a114","Type":"ContainerStarted","Data":"5e5c731108de396e437383bd9209be240b457393cf74c23994f8a353dbe02b13"} Feb 20 16:45:25 crc kubenswrapper[4697]: I0220 16:45:25.041178 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-vwprb" event={"ID":"7b12f1ce-9e0b-458b-990a-e38c2f2139c5","Type":"ContainerStarted","Data":"378a2fe71d302c350aebc01283207548e3e00ff1c4ad2dc696ba2c8d9f07c673"} Feb 20 16:45:25 crc kubenswrapper[4697]: I0220 16:45:25.146878 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" Feb 20 16:45:25 crc kubenswrapper[4697]: I0220 16:45:25.162852 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c865cc74f-r5qbl"] Feb 20 16:45:25 crc kubenswrapper[4697]: W0220 16:45:25.171025 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d1e467b_80f1_4ad2_a9c6_06fd8d2fbdb8.slice/crio-d97ce4e1ffb72c44c427a35c4746806638656955e10614d1ab7614579ed0cf06 WatchSource:0}: Error finding container d97ce4e1ffb72c44c427a35c4746806638656955e10614d1ab7614579ed0cf06: Status 404 returned error can't find the container with id d97ce4e1ffb72c44c427a35c4746806638656955e10614d1ab7614579ed0cf06 Feb 20 16:45:25 crc kubenswrapper[4697]: I0220 16:45:25.353253 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc"] Feb 20 16:45:25 crc kubenswrapper[4697]: W0220 16:45:25.358284 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod567d32fd_31d2_4822_b487_ec35c250663d.slice/crio-7ab802216f0a5a50f4c5ecc16a72995bd735aff9e948ec2eb220ab0a4505174c WatchSource:0}: Error finding container 7ab802216f0a5a50f4c5ecc16a72995bd735aff9e948ec2eb220ab0a4505174c: Status 404 returned error can't find the container with id 7ab802216f0a5a50f4c5ecc16a72995bd735aff9e948ec2eb220ab0a4505174c Feb 20 16:45:26 crc kubenswrapper[4697]: I0220 16:45:26.047247 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c865cc74f-r5qbl" event={"ID":"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8","Type":"ContainerStarted","Data":"b4aad7eddca9137a3c9bc3028df9c92777fe92b69f296af659ce5e7dfcd78330"} Feb 20 16:45:26 crc kubenswrapper[4697]: I0220 16:45:26.047296 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c865cc74f-r5qbl" 
event={"ID":"0d1e467b-80f1-4ad2-a9c6-06fd8d2fbdb8","Type":"ContainerStarted","Data":"d97ce4e1ffb72c44c427a35c4746806638656955e10614d1ab7614579ed0cf06"} Feb 20 16:45:26 crc kubenswrapper[4697]: I0220 16:45:26.050035 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" event={"ID":"d88c9050-548c-4305-b532-085c83436b3e","Type":"ContainerStarted","Data":"100b077fdea3edbf6f7f57f8449edfe87369a1c8831990b66e61ba3b3188793f"} Feb 20 16:45:26 crc kubenswrapper[4697]: I0220 16:45:26.052147 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" event={"ID":"567d32fd-31d2-4822-b487-ec35c250663d","Type":"ContainerStarted","Data":"7ab802216f0a5a50f4c5ecc16a72995bd735aff9e948ec2eb220ab0a4505174c"} Feb 20 16:45:26 crc kubenswrapper[4697]: I0220 16:45:26.072529 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c865cc74f-r5qbl" podStartSLOduration=2.072509148 podStartE2EDuration="2.072509148s" podCreationTimestamp="2026-02-20 16:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:45:26.064765033 +0000 UTC m=+833.844810461" watchObservedRunningTime="2026-02-20 16:45:26.072509148 +0000 UTC m=+833.852554556" Feb 20 16:45:28 crc kubenswrapper[4697]: I0220 16:45:28.064557 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" event={"ID":"d88c9050-548c-4305-b532-085c83436b3e","Type":"ContainerStarted","Data":"cfc2c6f95f4c8cb5546edd2704fa3aa76b2e86aa5fffbbf6750374849c18111e"} Feb 20 16:45:28 crc kubenswrapper[4697]: I0220 16:45:28.067749 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" 
event={"ID":"567d32fd-31d2-4822-b487-ec35c250663d","Type":"ContainerStarted","Data":"0794669da67b6fad630350d7e9c30813b60ba511a9877175339fae17b755e805"} Feb 20 16:45:28 crc kubenswrapper[4697]: I0220 16:45:28.067939 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" Feb 20 16:45:28 crc kubenswrapper[4697]: I0220 16:45:28.069595 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-vwprb" event={"ID":"7b12f1ce-9e0b-458b-990a-e38c2f2139c5","Type":"ContainerStarted","Data":"bd4da17bca2e9e488c1b70799f07ab48453ef144f8a28dc28864281d5499f151"} Feb 20 16:45:28 crc kubenswrapper[4697]: I0220 16:45:28.071130 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mmnb6" event={"ID":"89578f54-097d-4b2b-9809-103034a3a114","Type":"ContainerStarted","Data":"84d491bfd726e973a9d4ad91149610ce4f59dbc5e449069c2a4a13bbbc11a5f9"} Feb 20 16:45:28 crc kubenswrapper[4697]: I0220 16:45:28.071259 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:28 crc kubenswrapper[4697]: I0220 16:45:28.082898 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9x7l9" podStartSLOduration=1.99679805 podStartE2EDuration="4.082877663s" podCreationTimestamp="2026-02-20 16:45:24 +0000 UTC" firstStartedPulling="2026-02-20 16:45:25.048736305 +0000 UTC m=+832.828781713" lastFinishedPulling="2026-02-20 16:45:27.134815918 +0000 UTC m=+834.914861326" observedRunningTime="2026-02-20 16:45:28.076224106 +0000 UTC m=+835.856269514" watchObservedRunningTime="2026-02-20 16:45:28.082877663 +0000 UTC m=+835.862923071" Feb 20 16:45:28 crc kubenswrapper[4697]: I0220 16:45:28.095241 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mmnb6" 
podStartSLOduration=1.577786803 podStartE2EDuration="4.095216073s" podCreationTimestamp="2026-02-20 16:45:24 +0000 UTC" firstStartedPulling="2026-02-20 16:45:24.611771857 +0000 UTC m=+832.391817265" lastFinishedPulling="2026-02-20 16:45:27.129201107 +0000 UTC m=+834.909246535" observedRunningTime="2026-02-20 16:45:28.089467258 +0000 UTC m=+835.869512666" watchObservedRunningTime="2026-02-20 16:45:28.095216073 +0000 UTC m=+835.875261481" Feb 20 16:45:28 crc kubenswrapper[4697]: I0220 16:45:28.110500 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" podStartSLOduration=2.335692359 podStartE2EDuration="4.110476635s" podCreationTimestamp="2026-02-20 16:45:24 +0000 UTC" firstStartedPulling="2026-02-20 16:45:25.361204191 +0000 UTC m=+833.141249609" lastFinishedPulling="2026-02-20 16:45:27.135988457 +0000 UTC m=+834.916033885" observedRunningTime="2026-02-20 16:45:28.103263454 +0000 UTC m=+835.883308872" watchObservedRunningTime="2026-02-20 16:45:28.110476635 +0000 UTC m=+835.890522053" Feb 20 16:45:30 crc kubenswrapper[4697]: I0220 16:45:30.096089 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-vwprb" event={"ID":"7b12f1ce-9e0b-458b-990a-e38c2f2139c5","Type":"ContainerStarted","Data":"8a24848f1f1578b7a7ae9635360c1fe415443a1086650c6a8c867c5d3142849c"} Feb 20 16:45:30 crc kubenswrapper[4697]: I0220 16:45:30.117295 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-vwprb" podStartSLOduration=1.7076675080000001 podStartE2EDuration="6.117198879s" podCreationTimestamp="2026-02-20 16:45:24 +0000 UTC" firstStartedPulling="2026-02-20 16:45:24.967980629 +0000 UTC m=+832.748026037" lastFinishedPulling="2026-02-20 16:45:29.37751201 +0000 UTC m=+837.157557408" observedRunningTime="2026-02-20 16:45:30.113779873 +0000 UTC m=+837.893825321" watchObservedRunningTime="2026-02-20 
16:45:30.117198879 +0000 UTC m=+837.897244297" Feb 20 16:45:34 crc kubenswrapper[4697]: I0220 16:45:34.602414 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-mmnb6" Feb 20 16:45:34 crc kubenswrapper[4697]: I0220 16:45:34.947612 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:34 crc kubenswrapper[4697]: I0220 16:45:34.947660 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:34 crc kubenswrapper[4697]: I0220 16:45:34.953514 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:35 crc kubenswrapper[4697]: I0220 16:45:35.135634 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c865cc74f-r5qbl" Feb 20 16:45:35 crc kubenswrapper[4697]: I0220 16:45:35.192040 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-txsqk"] Feb 20 16:45:45 crc kubenswrapper[4697]: I0220 16:45:45.163672 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-xmzrc" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.279061 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f"] Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.281018 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.283923 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.293097 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f"] Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.447427 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w94hv\" (UniqueName: \"kubernetes.io/projected/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-kube-api-access-w94hv\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.447540 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.447588 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:45:58 crc kubenswrapper[4697]: 
I0220 16:45:58.548478 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.548567 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w94hv\" (UniqueName: \"kubernetes.io/projected/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-kube-api-access-w94hv\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.548597 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.549248 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.549505 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.592995 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w94hv\" (UniqueName: \"kubernetes.io/projected/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-kube-api-access-w94hv\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.595843 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:45:58 crc kubenswrapper[4697]: I0220 16:45:58.806512 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f"] Feb 20 16:45:59 crc kubenswrapper[4697]: I0220 16:45:59.317572 4697 generic.go:334] "Generic (PLEG): container finished" podID="53d4ee21-f1e0-4bea-b34f-6ff260c092cd" containerID="b8edbb6f2349838885b5f4f70bbb7e756c2b860635dc9f94dbcc1b8e07c9100c" exitCode=0 Feb 20 16:45:59 crc kubenswrapper[4697]: I0220 16:45:59.317626 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" event={"ID":"53d4ee21-f1e0-4bea-b34f-6ff260c092cd","Type":"ContainerDied","Data":"b8edbb6f2349838885b5f4f70bbb7e756c2b860635dc9f94dbcc1b8e07c9100c"} Feb 20 16:45:59 crc kubenswrapper[4697]: I0220 16:45:59.317654 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" event={"ID":"53d4ee21-f1e0-4bea-b34f-6ff260c092cd","Type":"ContainerStarted","Data":"02d0e182c995dbd06181482f606a855f29343e0ba92bd25f2e51e9012bde2ec8"} Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.252022 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-txsqk" podUID="70f9d3b5-82e4-47b2-ba65-88980dc9b401" containerName="console" containerID="cri-o://56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0" gracePeriod=15 Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.712675 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-txsqk_70f9d3b5-82e4-47b2-ba65-88980dc9b401/console/0.log" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.712777 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.881390 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-oauth-serving-cert\") pod \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.881727 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-service-ca\") pod \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.881761 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-trusted-ca-bundle\") pod 
\"70f9d3b5-82e4-47b2-ba65-88980dc9b401\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.881803 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-oauth-config\") pod \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.881827 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z42pt\" (UniqueName: \"kubernetes.io/projected/70f9d3b5-82e4-47b2-ba65-88980dc9b401-kube-api-access-z42pt\") pod \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.881859 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-serving-cert\") pod \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.881875 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-config\") pod \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\" (UID: \"70f9d3b5-82e4-47b2-ba65-88980dc9b401\") " Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.882459 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-config" (OuterVolumeSpecName: "console-config") pod "70f9d3b5-82e4-47b2-ba65-88980dc9b401" (UID: "70f9d3b5-82e4-47b2-ba65-88980dc9b401"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.882535 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "70f9d3b5-82e4-47b2-ba65-88980dc9b401" (UID: "70f9d3b5-82e4-47b2-ba65-88980dc9b401"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.882533 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "70f9d3b5-82e4-47b2-ba65-88980dc9b401" (UID: "70f9d3b5-82e4-47b2-ba65-88980dc9b401"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.883587 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-service-ca" (OuterVolumeSpecName: "service-ca") pod "70f9d3b5-82e4-47b2-ba65-88980dc9b401" (UID: "70f9d3b5-82e4-47b2-ba65-88980dc9b401"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.888965 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "70f9d3b5-82e4-47b2-ba65-88980dc9b401" (UID: "70f9d3b5-82e4-47b2-ba65-88980dc9b401"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.889521 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "70f9d3b5-82e4-47b2-ba65-88980dc9b401" (UID: "70f9d3b5-82e4-47b2-ba65-88980dc9b401"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.893899 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f9d3b5-82e4-47b2-ba65-88980dc9b401-kube-api-access-z42pt" (OuterVolumeSpecName: "kube-api-access-z42pt") pod "70f9d3b5-82e4-47b2-ba65-88980dc9b401" (UID: "70f9d3b5-82e4-47b2-ba65-88980dc9b401"). InnerVolumeSpecName "kube-api-access-z42pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.983649 4697 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.983704 4697 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.983718 4697 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.983732 4697 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.983743 4697 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70f9d3b5-82e4-47b2-ba65-88980dc9b401-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.983756 4697 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70f9d3b5-82e4-47b2-ba65-88980dc9b401-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:46:00 crc kubenswrapper[4697]: I0220 16:46:00.983787 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z42pt\" (UniqueName: \"kubernetes.io/projected/70f9d3b5-82e4-47b2-ba65-88980dc9b401-kube-api-access-z42pt\") on node \"crc\" DevicePath \"\"" Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.185192 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.185249 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.335173 4697 generic.go:334] "Generic (PLEG): container finished" podID="53d4ee21-f1e0-4bea-b34f-6ff260c092cd" containerID="224b468798ee63942dcd03e37884abd619f61599944993c418904ab6f523f11b" exitCode=0 Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 
16:46:01.335240 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" event={"ID":"53d4ee21-f1e0-4bea-b34f-6ff260c092cd","Type":"ContainerDied","Data":"224b468798ee63942dcd03e37884abd619f61599944993c418904ab6f523f11b"} Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.337205 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-txsqk_70f9d3b5-82e4-47b2-ba65-88980dc9b401/console/0.log" Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.337240 4697 generic.go:334] "Generic (PLEG): container finished" podID="70f9d3b5-82e4-47b2-ba65-88980dc9b401" containerID="56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0" exitCode=2 Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.337258 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txsqk" event={"ID":"70f9d3b5-82e4-47b2-ba65-88980dc9b401","Type":"ContainerDied","Data":"56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0"} Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.337272 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txsqk" event={"ID":"70f9d3b5-82e4-47b2-ba65-88980dc9b401","Type":"ContainerDied","Data":"eaedf32cafb61b90d2b084bed9bbf7097316783584742857d68ecaa704f3077f"} Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.337289 4697 scope.go:117] "RemoveContainer" containerID="56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0" Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.337340 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-txsqk" Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.364123 4697 scope.go:117] "RemoveContainer" containerID="56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0" Feb 20 16:46:01 crc kubenswrapper[4697]: E0220 16:46:01.364490 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0\": container with ID starting with 56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0 not found: ID does not exist" containerID="56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0" Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.364545 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0"} err="failed to get container status \"56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0\": rpc error: code = NotFound desc = could not find container \"56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0\": container with ID starting with 56736819e874de53b7d2176e0a04f0f8a6ee75e8a9c4d8ada9c896d344f9f5d0 not found: ID does not exist" Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.458553 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-txsqk"] Feb 20 16:46:01 crc kubenswrapper[4697]: I0220 16:46:01.463583 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-txsqk"] Feb 20 16:46:02 crc kubenswrapper[4697]: I0220 16:46:02.346958 4697 generic.go:334] "Generic (PLEG): container finished" podID="53d4ee21-f1e0-4bea-b34f-6ff260c092cd" containerID="81227e6569421da0a521f69f0419dc2b30343ed4c5fb82afbd8e40f592649c92" exitCode=0 Feb 20 16:46:02 crc kubenswrapper[4697]: I0220 16:46:02.347033 4697 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" event={"ID":"53d4ee21-f1e0-4bea-b34f-6ff260c092cd","Type":"ContainerDied","Data":"81227e6569421da0a521f69f0419dc2b30343ed4c5fb82afbd8e40f592649c92"} Feb 20 16:46:02 crc kubenswrapper[4697]: I0220 16:46:02.906781 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f9d3b5-82e4-47b2-ba65-88980dc9b401" path="/var/lib/kubelet/pods/70f9d3b5-82e4-47b2-ba65-88980dc9b401/volumes" Feb 20 16:46:03 crc kubenswrapper[4697]: I0220 16:46:03.603579 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:46:03 crc kubenswrapper[4697]: I0220 16:46:03.755209 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-bundle\") pod \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " Feb 20 16:46:03 crc kubenswrapper[4697]: I0220 16:46:03.755259 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-util\") pod \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " Feb 20 16:46:03 crc kubenswrapper[4697]: I0220 16:46:03.755333 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w94hv\" (UniqueName: \"kubernetes.io/projected/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-kube-api-access-w94hv\") pod \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\" (UID: \"53d4ee21-f1e0-4bea-b34f-6ff260c092cd\") " Feb 20 16:46:03 crc kubenswrapper[4697]: I0220 16:46:03.756787 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-bundle" (OuterVolumeSpecName: "bundle") pod "53d4ee21-f1e0-4bea-b34f-6ff260c092cd" (UID: "53d4ee21-f1e0-4bea-b34f-6ff260c092cd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:46:03 crc kubenswrapper[4697]: I0220 16:46:03.762520 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-kube-api-access-w94hv" (OuterVolumeSpecName: "kube-api-access-w94hv") pod "53d4ee21-f1e0-4bea-b34f-6ff260c092cd" (UID: "53d4ee21-f1e0-4bea-b34f-6ff260c092cd"). InnerVolumeSpecName "kube-api-access-w94hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:46:03 crc kubenswrapper[4697]: I0220 16:46:03.769302 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-util" (OuterVolumeSpecName: "util") pod "53d4ee21-f1e0-4bea-b34f-6ff260c092cd" (UID: "53d4ee21-f1e0-4bea-b34f-6ff260c092cd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:46:03 crc kubenswrapper[4697]: I0220 16:46:03.856472 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w94hv\" (UniqueName: \"kubernetes.io/projected/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-kube-api-access-w94hv\") on node \"crc\" DevicePath \"\"" Feb 20 16:46:03 crc kubenswrapper[4697]: I0220 16:46:03.856502 4697 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:46:03 crc kubenswrapper[4697]: I0220 16:46:03.856512 4697 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53d4ee21-f1e0-4bea-b34f-6ff260c092cd-util\") on node \"crc\" DevicePath \"\"" Feb 20 16:46:04 crc kubenswrapper[4697]: I0220 16:46:04.364130 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" event={"ID":"53d4ee21-f1e0-4bea-b34f-6ff260c092cd","Type":"ContainerDied","Data":"02d0e182c995dbd06181482f606a855f29343e0ba92bd25f2e51e9012bde2ec8"} Feb 20 16:46:04 crc kubenswrapper[4697]: I0220 16:46:04.364181 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d0e182c995dbd06181482f606a855f29343e0ba92bd25f2e51e9012bde2ec8" Feb 20 16:46:04 crc kubenswrapper[4697]: I0220 16:46:04.364198 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.764476 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g"] Feb 20 16:46:12 crc kubenswrapper[4697]: E0220 16:46:12.765290 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d4ee21-f1e0-4bea-b34f-6ff260c092cd" containerName="util" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.765316 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d4ee21-f1e0-4bea-b34f-6ff260c092cd" containerName="util" Feb 20 16:46:12 crc kubenswrapper[4697]: E0220 16:46:12.765330 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d4ee21-f1e0-4bea-b34f-6ff260c092cd" containerName="extract" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.765339 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d4ee21-f1e0-4bea-b34f-6ff260c092cd" containerName="extract" Feb 20 16:46:12 crc kubenswrapper[4697]: E0220 16:46:12.765353 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f9d3b5-82e4-47b2-ba65-88980dc9b401" containerName="console" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.765361 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f9d3b5-82e4-47b2-ba65-88980dc9b401" containerName="console" Feb 20 16:46:12 crc kubenswrapper[4697]: E0220 16:46:12.765371 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d4ee21-f1e0-4bea-b34f-6ff260c092cd" containerName="pull" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.765378 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d4ee21-f1e0-4bea-b34f-6ff260c092cd" containerName="pull" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.765502 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="53d4ee21-f1e0-4bea-b34f-6ff260c092cd" 
containerName="extract" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.765523 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f9d3b5-82e4-47b2-ba65-88980dc9b401" containerName="console" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.766014 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.768689 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-zrq8w" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.768748 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.768706 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.769053 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.774033 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.780172 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g"] Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.873102 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/538df488-7224-4d8b-a08c-463865282008-webhook-cert\") pod \"metallb-operator-controller-manager-6574bdbb48-rfq5g\" (UID: \"538df488-7224-4d8b-a08c-463865282008\") " pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 
16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.873146 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrfj9\" (UniqueName: \"kubernetes.io/projected/538df488-7224-4d8b-a08c-463865282008-kube-api-access-hrfj9\") pod \"metallb-operator-controller-manager-6574bdbb48-rfq5g\" (UID: \"538df488-7224-4d8b-a08c-463865282008\") " pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.873231 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/538df488-7224-4d8b-a08c-463865282008-apiservice-cert\") pod \"metallb-operator-controller-manager-6574bdbb48-rfq5g\" (UID: \"538df488-7224-4d8b-a08c-463865282008\") " pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.974069 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/538df488-7224-4d8b-a08c-463865282008-webhook-cert\") pod \"metallb-operator-controller-manager-6574bdbb48-rfq5g\" (UID: \"538df488-7224-4d8b-a08c-463865282008\") " pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.974121 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrfj9\" (UniqueName: \"kubernetes.io/projected/538df488-7224-4d8b-a08c-463865282008-kube-api-access-hrfj9\") pod \"metallb-operator-controller-manager-6574bdbb48-rfq5g\" (UID: \"538df488-7224-4d8b-a08c-463865282008\") " pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.974208 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/538df488-7224-4d8b-a08c-463865282008-apiservice-cert\") pod \"metallb-operator-controller-manager-6574bdbb48-rfq5g\" (UID: \"538df488-7224-4d8b-a08c-463865282008\") " pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.979319 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/538df488-7224-4d8b-a08c-463865282008-webhook-cert\") pod \"metallb-operator-controller-manager-6574bdbb48-rfq5g\" (UID: \"538df488-7224-4d8b-a08c-463865282008\") " pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.993037 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/538df488-7224-4d8b-a08c-463865282008-apiservice-cert\") pod \"metallb-operator-controller-manager-6574bdbb48-rfq5g\" (UID: \"538df488-7224-4d8b-a08c-463865282008\") " pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:12 crc kubenswrapper[4697]: I0220 16:46:12.994021 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrfj9\" (UniqueName: \"kubernetes.io/projected/538df488-7224-4d8b-a08c-463865282008-kube-api-access-hrfj9\") pod \"metallb-operator-controller-manager-6574bdbb48-rfq5g\" (UID: \"538df488-7224-4d8b-a08c-463865282008\") " pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.084134 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.189561 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv"] Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.190320 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.192590 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.192822 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.192959 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-4bdhr" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.211890 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv"] Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.342702 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g"] Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.377866 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50d070c4-1559-40ac-9375-96ddffeb6b1a-webhook-cert\") pod \"metallb-operator-webhook-server-77767df4c8-b6xsv\" (UID: \"50d070c4-1559-40ac-9375-96ddffeb6b1a\") " pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.378188 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50d070c4-1559-40ac-9375-96ddffeb6b1a-apiservice-cert\") pod \"metallb-operator-webhook-server-77767df4c8-b6xsv\" (UID: \"50d070c4-1559-40ac-9375-96ddffeb6b1a\") " pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.378216 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nncpp\" (UniqueName: \"kubernetes.io/projected/50d070c4-1559-40ac-9375-96ddffeb6b1a-kube-api-access-nncpp\") pod \"metallb-operator-webhook-server-77767df4c8-b6xsv\" (UID: \"50d070c4-1559-40ac-9375-96ddffeb6b1a\") " pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.407808 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" event={"ID":"538df488-7224-4d8b-a08c-463865282008","Type":"ContainerStarted","Data":"4471e391a1dc2f9884bf7cebcfb21ac4ae724e8f25b21b1aa5e5c7f5cbf657c8"} Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.478971 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50d070c4-1559-40ac-9375-96ddffeb6b1a-apiservice-cert\") pod \"metallb-operator-webhook-server-77767df4c8-b6xsv\" (UID: \"50d070c4-1559-40ac-9375-96ddffeb6b1a\") " pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.479038 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nncpp\" (UniqueName: \"kubernetes.io/projected/50d070c4-1559-40ac-9375-96ddffeb6b1a-kube-api-access-nncpp\") pod \"metallb-operator-webhook-server-77767df4c8-b6xsv\" (UID: \"50d070c4-1559-40ac-9375-96ddffeb6b1a\") " 
pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.479095 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50d070c4-1559-40ac-9375-96ddffeb6b1a-webhook-cert\") pod \"metallb-operator-webhook-server-77767df4c8-b6xsv\" (UID: \"50d070c4-1559-40ac-9375-96ddffeb6b1a\") " pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.483049 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50d070c4-1559-40ac-9375-96ddffeb6b1a-webhook-cert\") pod \"metallb-operator-webhook-server-77767df4c8-b6xsv\" (UID: \"50d070c4-1559-40ac-9375-96ddffeb6b1a\") " pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.483111 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50d070c4-1559-40ac-9375-96ddffeb6b1a-apiservice-cert\") pod \"metallb-operator-webhook-server-77767df4c8-b6xsv\" (UID: \"50d070c4-1559-40ac-9375-96ddffeb6b1a\") " pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.498863 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nncpp\" (UniqueName: \"kubernetes.io/projected/50d070c4-1559-40ac-9375-96ddffeb6b1a-kube-api-access-nncpp\") pod \"metallb-operator-webhook-server-77767df4c8-b6xsv\" (UID: \"50d070c4-1559-40ac-9375-96ddffeb6b1a\") " pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.523675 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:13 crc kubenswrapper[4697]: I0220 16:46:13.744185 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv"] Feb 20 16:46:13 crc kubenswrapper[4697]: W0220 16:46:13.746857 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50d070c4_1559_40ac_9375_96ddffeb6b1a.slice/crio-b4b360bfe8902fca0988f39744c032c2ad660bd284b9ad5bbd67f9cbca72e8be WatchSource:0}: Error finding container b4b360bfe8902fca0988f39744c032c2ad660bd284b9ad5bbd67f9cbca72e8be: Status 404 returned error can't find the container with id b4b360bfe8902fca0988f39744c032c2ad660bd284b9ad5bbd67f9cbca72e8be Feb 20 16:46:14 crc kubenswrapper[4697]: I0220 16:46:14.413678 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" event={"ID":"50d070c4-1559-40ac-9375-96ddffeb6b1a","Type":"ContainerStarted","Data":"b4b360bfe8902fca0988f39744c032c2ad660bd284b9ad5bbd67f9cbca72e8be"} Feb 20 16:46:17 crc kubenswrapper[4697]: I0220 16:46:17.430860 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" event={"ID":"538df488-7224-4d8b-a08c-463865282008","Type":"ContainerStarted","Data":"da1703924d95a37d921e888bd6c1d3efab653688645b53c3663bfdca2258ab98"} Feb 20 16:46:17 crc kubenswrapper[4697]: I0220 16:46:17.431479 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:17 crc kubenswrapper[4697]: I0220 16:46:17.451295 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" podStartSLOduration=2.047245938 podStartE2EDuration="5.451279325s" 
podCreationTimestamp="2026-02-20 16:46:12 +0000 UTC" firstStartedPulling="2026-02-20 16:46:13.363832153 +0000 UTC m=+881.143877561" lastFinishedPulling="2026-02-20 16:46:16.76786554 +0000 UTC m=+884.547910948" observedRunningTime="2026-02-20 16:46:17.44554469 +0000 UTC m=+885.225590098" watchObservedRunningTime="2026-02-20 16:46:17.451279325 +0000 UTC m=+885.231324733" Feb 20 16:46:18 crc kubenswrapper[4697]: I0220 16:46:18.437253 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" event={"ID":"50d070c4-1559-40ac-9375-96ddffeb6b1a","Type":"ContainerStarted","Data":"9338f843982c832103538be509db82a0f9c3457e2469b9ad8f8597a9375c5179"} Feb 20 16:46:18 crc kubenswrapper[4697]: I0220 16:46:18.454590 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" podStartSLOduration=0.977734909 podStartE2EDuration="5.454576446s" podCreationTimestamp="2026-02-20 16:46:13 +0000 UTC" firstStartedPulling="2026-02-20 16:46:13.772653719 +0000 UTC m=+881.552699207" lastFinishedPulling="2026-02-20 16:46:18.249495336 +0000 UTC m=+886.029540744" observedRunningTime="2026-02-20 16:46:18.451624052 +0000 UTC m=+886.231669460" watchObservedRunningTime="2026-02-20 16:46:18.454576446 +0000 UTC m=+886.234621854" Feb 20 16:46:19 crc kubenswrapper[4697]: I0220 16:46:19.443223 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:31 crc kubenswrapper[4697]: I0220 16:46:31.184712 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:46:31 crc kubenswrapper[4697]: I0220 16:46:31.185224 4697 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:46:33 crc kubenswrapper[4697]: I0220 16:46:33.528407 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-77767df4c8-b6xsv" Feb 20 16:46:53 crc kubenswrapper[4697]: I0220 16:46:53.088057 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6574bdbb48-rfq5g" Feb 20 16:46:53 crc kubenswrapper[4697]: I0220 16:46:53.932072 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx"] Feb 20 16:46:53 crc kubenswrapper[4697]: I0220 16:46:53.933127 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" Feb 20 16:46:53 crc kubenswrapper[4697]: I0220 16:46:53.935242 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-cs65n" Feb 20 16:46:53 crc kubenswrapper[4697]: I0220 16:46:53.935668 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 20 16:46:53 crc kubenswrapper[4697]: I0220 16:46:53.945920 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx"] Feb 20 16:46:53 crc kubenswrapper[4697]: I0220 16:46:53.963538 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-rrd4g"] Feb 20 16:46:53 crc kubenswrapper[4697]: I0220 16:46:53.965587 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:53 crc kubenswrapper[4697]: I0220 16:46:53.966974 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 20 16:46:53 crc kubenswrapper[4697]: I0220 16:46:53.967467 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.028520 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-fzjzn"] Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.029542 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.034212 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-64qzx"] Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.035462 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.035806 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-ws2hb" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.035897 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.035933 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.036082 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.039920 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.041892 4697 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-64qzx"] Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.094634 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-frr-startup\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.094693 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44tnf\" (UniqueName: \"kubernetes.io/projected/08c8d96c-3974-4e6f-ae8e-7283a628643e-kube-api-access-44tnf\") pod \"frr-k8s-webhook-server-78b44bf5bb-lk6qx\" (UID: \"08c8d96c-3974-4e6f-ae8e-7283a628643e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.094727 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-metrics-certs\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.094821 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-frr-conf\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.094854 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-metrics\") pod \"frr-k8s-rrd4g\" (UID: 
\"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.094875 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qbbl\" (UniqueName: \"kubernetes.io/projected/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-kube-api-access-9qbbl\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.094897 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-frr-sockets\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.094936 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08c8d96c-3974-4e6f-ae8e-7283a628643e-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-lk6qx\" (UID: \"08c8d96c-3974-4e6f-ae8e-7283a628643e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.094957 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-reloader\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.195636 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08c8d96c-3974-4e6f-ae8e-7283a628643e-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-lk6qx\" (UID: \"08c8d96c-3974-4e6f-ae8e-7283a628643e\") " 
pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.195794 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-reloader\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.195813 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-frr-startup\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: E0220 16:46:54.195753 4697 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 20 16:46:54 crc kubenswrapper[4697]: E0220 16:46:54.195885 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08c8d96c-3974-4e6f-ae8e-7283a628643e-cert podName:08c8d96c-3974-4e6f-ae8e-7283a628643e nodeName:}" failed. No retries permitted until 2026-02-20 16:46:54.695864882 +0000 UTC m=+922.475910280 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/08c8d96c-3974-4e6f-ae8e-7283a628643e-cert") pod "frr-k8s-webhook-server-78b44bf5bb-lk6qx" (UID: "08c8d96c-3974-4e6f-ae8e-7283a628643e") : secret "frr-k8s-webhook-server-cert" not found Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196140 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rrxf\" (UniqueName: \"kubernetes.io/projected/e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3-kube-api-access-7rrxf\") pod \"controller-69bbfbf88f-64qzx\" (UID: \"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3\") " pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196163 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44tnf\" (UniqueName: \"kubernetes.io/projected/08c8d96c-3974-4e6f-ae8e-7283a628643e-kube-api-access-44tnf\") pod \"frr-k8s-webhook-server-78b44bf5bb-lk6qx\" (UID: \"08c8d96c-3974-4e6f-ae8e-7283a628643e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196208 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-reloader\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196225 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-metrics-certs\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196280 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-metrics-certs\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196302 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3-metrics-certs\") pod \"controller-69bbfbf88f-64qzx\" (UID: \"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3\") " pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196320 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-memberlist\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196381 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3-cert\") pod \"controller-69bbfbf88f-64qzx\" (UID: \"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3\") " pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196420 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-frr-conf\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196714 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-frr-conf\") pod \"frr-k8s-rrd4g\" (UID: 
\"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196762 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-metrics\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196785 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qbbl\" (UniqueName: \"kubernetes.io/projected/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-kube-api-access-9qbbl\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196942 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-frr-startup\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.196984 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-metrics\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.197028 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-frr-sockets\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.197047 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7d6d1b55-ac21-4967-9487-53b7b236b847-metallb-excludel2\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.197065 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crmxg\" (UniqueName: \"kubernetes.io/projected/7d6d1b55-ac21-4967-9487-53b7b236b847-kube-api-access-crmxg\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.197237 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-frr-sockets\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.201226 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-metrics-certs\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.215407 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qbbl\" (UniqueName: \"kubernetes.io/projected/531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7-kube-api-access-9qbbl\") pod \"frr-k8s-rrd4g\" (UID: \"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7\") " pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.215747 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44tnf\" (UniqueName: 
\"kubernetes.io/projected/08c8d96c-3974-4e6f-ae8e-7283a628643e-kube-api-access-44tnf\") pod \"frr-k8s-webhook-server-78b44bf5bb-lk6qx\" (UID: \"08c8d96c-3974-4e6f-ae8e-7283a628643e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.297844 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7d6d1b55-ac21-4967-9487-53b7b236b847-metallb-excludel2\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.297900 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crmxg\" (UniqueName: \"kubernetes.io/projected/7d6d1b55-ac21-4967-9487-53b7b236b847-kube-api-access-crmxg\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.297973 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rrxf\" (UniqueName: \"kubernetes.io/projected/e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3-kube-api-access-7rrxf\") pod \"controller-69bbfbf88f-64qzx\" (UID: \"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3\") " pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.298029 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-metrics-certs\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.298058 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3-metrics-certs\") pod \"controller-69bbfbf88f-64qzx\" (UID: \"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3\") " pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.298079 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-memberlist\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.298118 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3-cert\") pod \"controller-69bbfbf88f-64qzx\" (UID: \"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3\") " pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.298486 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7d6d1b55-ac21-4967-9487-53b7b236b847-metallb-excludel2\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: E0220 16:46:54.298581 4697 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 20 16:46:54 crc kubenswrapper[4697]: E0220 16:46:54.298635 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-memberlist podName:7d6d1b55-ac21-4967-9487-53b7b236b847 nodeName:}" failed. No retries permitted until 2026-02-20 16:46:54.798618406 +0000 UTC m=+922.578663904 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-memberlist") pod "speaker-fzjzn" (UID: "7d6d1b55-ac21-4967-9487-53b7b236b847") : secret "metallb-memberlist" not found Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.300331 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.301635 4697 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.303413 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-metrics-certs\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.304821 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3-metrics-certs\") pod \"controller-69bbfbf88f-64qzx\" (UID: \"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3\") " pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.312282 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3-cert\") pod \"controller-69bbfbf88f-64qzx\" (UID: \"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3\") " pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.312706 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crmxg\" (UniqueName: \"kubernetes.io/projected/7d6d1b55-ac21-4967-9487-53b7b236b847-kube-api-access-crmxg\") pod \"speaker-fzjzn\" (UID: 
\"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.315496 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rrxf\" (UniqueName: \"kubernetes.io/projected/e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3-kube-api-access-7rrxf\") pod \"controller-69bbfbf88f-64qzx\" (UID: \"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3\") " pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.361824 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.666020 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rrd4g" event={"ID":"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7","Type":"ContainerStarted","Data":"e239432e7d436a8456eda2b8a2b4c7d27c9e52ad6ab5c305c9486f78dfb69c25"} Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.703394 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08c8d96c-3974-4e6f-ae8e-7283a628643e-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-lk6qx\" (UID: \"08c8d96c-3974-4e6f-ae8e-7283a628643e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.711005 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08c8d96c-3974-4e6f-ae8e-7283a628643e-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-lk6qx\" (UID: \"08c8d96c-3974-4e6f-ae8e-7283a628643e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.764088 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-64qzx"] Feb 20 16:46:54 crc kubenswrapper[4697]: W0220 16:46:54.772884 4697 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2a9bc57_0e8b_4eb5_a28c_109e164aa8d3.slice/crio-d5bfd4d1fe7a7d3066f91a7497a5693e28acc58e3ddac885528d199baf64c110 WatchSource:0}: Error finding container d5bfd4d1fe7a7d3066f91a7497a5693e28acc58e3ddac885528d199baf64c110: Status 404 returned error can't find the container with id d5bfd4d1fe7a7d3066f91a7497a5693e28acc58e3ddac885528d199baf64c110 Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.805159 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-memberlist\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:54 crc kubenswrapper[4697]: E0220 16:46:54.805311 4697 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 20 16:46:54 crc kubenswrapper[4697]: E0220 16:46:54.805745 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-memberlist podName:7d6d1b55-ac21-4967-9487-53b7b236b847 nodeName:}" failed. No retries permitted until 2026-02-20 16:46:55.805727406 +0000 UTC m=+923.585772814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-memberlist") pod "speaker-fzjzn" (UID: "7d6d1b55-ac21-4967-9487-53b7b236b847") : secret "metallb-memberlist" not found Feb 20 16:46:54 crc kubenswrapper[4697]: I0220 16:46:54.869772 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" Feb 20 16:46:55 crc kubenswrapper[4697]: I0220 16:46:55.302173 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx"] Feb 20 16:46:55 crc kubenswrapper[4697]: W0220 16:46:55.310514 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c8d96c_3974_4e6f_ae8e_7283a628643e.slice/crio-7b197e0524bd2be6d0fffacafbbfa2d5f4bdf6ecfec29169d9c4192f9c8505c2 WatchSource:0}: Error finding container 7b197e0524bd2be6d0fffacafbbfa2d5f4bdf6ecfec29169d9c4192f9c8505c2: Status 404 returned error can't find the container with id 7b197e0524bd2be6d0fffacafbbfa2d5f4bdf6ecfec29169d9c4192f9c8505c2 Feb 20 16:46:55 crc kubenswrapper[4697]: I0220 16:46:55.673182 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" event={"ID":"08c8d96c-3974-4e6f-ae8e-7283a628643e","Type":"ContainerStarted","Data":"7b197e0524bd2be6d0fffacafbbfa2d5f4bdf6ecfec29169d9c4192f9c8505c2"} Feb 20 16:46:55 crc kubenswrapper[4697]: I0220 16:46:55.676171 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-64qzx" event={"ID":"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3","Type":"ContainerStarted","Data":"edf99cc793f2c42d0e99a5bfd6ab7e2ad2255c3034182915ae255b5b2ca27cbc"} Feb 20 16:46:55 crc kubenswrapper[4697]: I0220 16:46:55.676258 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-64qzx" event={"ID":"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3","Type":"ContainerStarted","Data":"f54c09fe64e16750e95fd21bec2a480c891c0eddeceb8c3bef9b311bd4357674"} Feb 20 16:46:55 crc kubenswrapper[4697]: I0220 16:46:55.676276 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-64qzx" 
event={"ID":"e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3","Type":"ContainerStarted","Data":"d5bfd4d1fe7a7d3066f91a7497a5693e28acc58e3ddac885528d199baf64c110"} Feb 20 16:46:55 crc kubenswrapper[4697]: I0220 16:46:55.677486 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:46:55 crc kubenswrapper[4697]: I0220 16:46:55.696138 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-64qzx" podStartSLOduration=1.69612221 podStartE2EDuration="1.69612221s" podCreationTimestamp="2026-02-20 16:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:46:55.692523189 +0000 UTC m=+923.472568597" watchObservedRunningTime="2026-02-20 16:46:55.69612221 +0000 UTC m=+923.476167618" Feb 20 16:46:55 crc kubenswrapper[4697]: I0220 16:46:55.818270 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-memberlist\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:55 crc kubenswrapper[4697]: I0220 16:46:55.827002 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7d6d1b55-ac21-4967-9487-53b7b236b847-memberlist\") pod \"speaker-fzjzn\" (UID: \"7d6d1b55-ac21-4967-9487-53b7b236b847\") " pod="metallb-system/speaker-fzjzn" Feb 20 16:46:55 crc kubenswrapper[4697]: I0220 16:46:55.848884 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-fzjzn" Feb 20 16:46:55 crc kubenswrapper[4697]: W0220 16:46:55.876332 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d6d1b55_ac21_4967_9487_53b7b236b847.slice/crio-5076307ed3e4467ef8f5b9f476f8c0434eb37d7c09dfaf5b7fedd347672b002c WatchSource:0}: Error finding container 5076307ed3e4467ef8f5b9f476f8c0434eb37d7c09dfaf5b7fedd347672b002c: Status 404 returned error can't find the container with id 5076307ed3e4467ef8f5b9f476f8c0434eb37d7c09dfaf5b7fedd347672b002c Feb 20 16:46:56 crc kubenswrapper[4697]: I0220 16:46:56.696001 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fzjzn" event={"ID":"7d6d1b55-ac21-4967-9487-53b7b236b847","Type":"ContainerStarted","Data":"1eb6aa3cff28fed0cd5eadd90c7422b26ee5966909cca4d1f4220d9c1bf52452"} Feb 20 16:46:56 crc kubenswrapper[4697]: I0220 16:46:56.696356 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fzjzn" event={"ID":"7d6d1b55-ac21-4967-9487-53b7b236b847","Type":"ContainerStarted","Data":"0dc34d88188bb367fc4a70d5a4c8727caf9beba2868786337cea151f195779b6"} Feb 20 16:46:56 crc kubenswrapper[4697]: I0220 16:46:56.696370 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fzjzn" event={"ID":"7d6d1b55-ac21-4967-9487-53b7b236b847","Type":"ContainerStarted","Data":"5076307ed3e4467ef8f5b9f476f8c0434eb37d7c09dfaf5b7fedd347672b002c"} Feb 20 16:46:56 crc kubenswrapper[4697]: I0220 16:46:56.696972 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-fzjzn" Feb 20 16:46:56 crc kubenswrapper[4697]: I0220 16:46:56.720626 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-fzjzn" podStartSLOduration=3.720606729 podStartE2EDuration="3.720606729s" podCreationTimestamp="2026-02-20 16:46:53 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:46:56.715212373 +0000 UTC m=+924.495257781" watchObservedRunningTime="2026-02-20 16:46:56.720606729 +0000 UTC m=+924.500652137" Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.184607 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.185112 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.185155 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.185730 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac1ff61636e81a13d334b99986c31ee9bcf221f2d7263a9112ad988ea78c70f4"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.185775 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" 
containerID="cri-o://ac1ff61636e81a13d334b99986c31ee9bcf221f2d7263a9112ad988ea78c70f4" gracePeriod=600 Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.734539 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" event={"ID":"08c8d96c-3974-4e6f-ae8e-7283a628643e","Type":"ContainerStarted","Data":"471244a73236c2c3b0f90f483e00b8a1d8cca98039d4ff3ec26a59b2c6aba525"} Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.735189 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.738705 4697 generic.go:334] "Generic (PLEG): container finished" podID="531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7" containerID="d9fc36a5966de6fdd8f8071a14949deead4c14a101ac35c4de2b19373f5ef9f7" exitCode=0 Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.739169 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rrd4g" event={"ID":"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7","Type":"ContainerDied","Data":"d9fc36a5966de6fdd8f8071a14949deead4c14a101ac35c4de2b19373f5ef9f7"} Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.743333 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="ac1ff61636e81a13d334b99986c31ee9bcf221f2d7263a9112ad988ea78c70f4" exitCode=0 Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.743376 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"ac1ff61636e81a13d334b99986c31ee9bcf221f2d7263a9112ad988ea78c70f4"} Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.743463 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" 
event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"7914ab01639074e62ef59025fd1a7ee59be0ddca912a97f841ccc347f5312e8e"} Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.743491 4697 scope.go:117] "RemoveContainer" containerID="a39e2b324782ff79e96c097bdf5a12c5992709cf28743465f5ab68009a413113" Feb 20 16:47:01 crc kubenswrapper[4697]: I0220 16:47:01.761048 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" podStartSLOduration=2.515167473 podStartE2EDuration="8.761031416s" podCreationTimestamp="2026-02-20 16:46:53 +0000 UTC" firstStartedPulling="2026-02-20 16:46:55.311732558 +0000 UTC m=+923.091777966" lastFinishedPulling="2026-02-20 16:47:01.557596501 +0000 UTC m=+929.337641909" observedRunningTime="2026-02-20 16:47:01.752233354 +0000 UTC m=+929.532278782" watchObservedRunningTime="2026-02-20 16:47:01.761031416 +0000 UTC m=+929.541076824" Feb 20 16:47:02 crc kubenswrapper[4697]: I0220 16:47:02.756383 4697 generic.go:334] "Generic (PLEG): container finished" podID="531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7" containerID="d9b72fbc278f164709efd21f0627f1729603faefcd4cb50247f80e6aee1a694d" exitCode=0 Feb 20 16:47:02 crc kubenswrapper[4697]: I0220 16:47:02.756427 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rrd4g" event={"ID":"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7","Type":"ContainerDied","Data":"d9b72fbc278f164709efd21f0627f1729603faefcd4cb50247f80e6aee1a694d"} Feb 20 16:47:03 crc kubenswrapper[4697]: I0220 16:47:03.769737 4697 generic.go:334] "Generic (PLEG): container finished" podID="531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7" containerID="13314a345d253391d2a397945b5598ece046128cb87e5ce9f59ea4bf8c19b8ba" exitCode=0 Feb 20 16:47:03 crc kubenswrapper[4697]: I0220 16:47:03.769879 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rrd4g" 
event={"ID":"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7","Type":"ContainerDied","Data":"13314a345d253391d2a397945b5598ece046128cb87e5ce9f59ea4bf8c19b8ba"} Feb 20 16:47:04 crc kubenswrapper[4697]: I0220 16:47:04.368302 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-64qzx" Feb 20 16:47:04 crc kubenswrapper[4697]: I0220 16:47:04.783681 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rrd4g" event={"ID":"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7","Type":"ContainerStarted","Data":"7b75755cb701bbcc409102eca2d33302078fd680a4cf16d507e4a5dcfd2ee955"} Feb 20 16:47:04 crc kubenswrapper[4697]: I0220 16:47:04.783731 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rrd4g" event={"ID":"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7","Type":"ContainerStarted","Data":"2bd87978e2824adef6b9126a38d8498047050043856e8aaa62f50d1df224ac6f"} Feb 20 16:47:04 crc kubenswrapper[4697]: I0220 16:47:04.783740 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rrd4g" event={"ID":"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7","Type":"ContainerStarted","Data":"0a90a574e33a35bf11dc85654748a47adf70676b563c44a9f0ce5db69857aee3"} Feb 20 16:47:04 crc kubenswrapper[4697]: I0220 16:47:04.783761 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rrd4g" event={"ID":"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7","Type":"ContainerStarted","Data":"e18681aea7cb1531c535b0b4bbc2b95166b48dc800f50ead97683b55d123ff10"} Feb 20 16:47:04 crc kubenswrapper[4697]: I0220 16:47:04.783771 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rrd4g" event={"ID":"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7","Type":"ContainerStarted","Data":"b5d52be5409c7482ad76e3f7d646dbd65374641d0fc703835af1921e48cd96cb"} Feb 20 16:47:04 crc kubenswrapper[4697]: I0220 16:47:04.783780 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-rrd4g" event={"ID":"531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7","Type":"ContainerStarted","Data":"901af86d48059163aff34239c60dc61bf88bba5ff98a98889dde8ca7eca48bf9"} Feb 20 16:47:04 crc kubenswrapper[4697]: I0220 16:47:04.783813 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:47:04 crc kubenswrapper[4697]: I0220 16:47:04.814897 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-rrd4g" podStartSLOduration=4.721971196 podStartE2EDuration="11.814882549s" podCreationTimestamp="2026-02-20 16:46:53 +0000 UTC" firstStartedPulling="2026-02-20 16:46:54.465651482 +0000 UTC m=+922.245696890" lastFinishedPulling="2026-02-20 16:47:01.558562835 +0000 UTC m=+929.338608243" observedRunningTime="2026-02-20 16:47:04.810510478 +0000 UTC m=+932.590555886" watchObservedRunningTime="2026-02-20 16:47:04.814882549 +0000 UTC m=+932.594927957" Feb 20 16:47:09 crc kubenswrapper[4697]: I0220 16:47:09.301018 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:47:09 crc kubenswrapper[4697]: I0220 16:47:09.338961 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:47:14 crc kubenswrapper[4697]: I0220 16:47:14.304075 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-rrd4g" Feb 20 16:47:14 crc kubenswrapper[4697]: I0220 16:47:14.927249 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lk6qx" Feb 20 16:47:15 crc kubenswrapper[4697]: I0220 16:47:15.853098 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-fzjzn" Feb 20 16:47:18 crc kubenswrapper[4697]: I0220 16:47:18.428137 4697 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-j9j4b"] Feb 20 16:47:18 crc kubenswrapper[4697]: I0220 16:47:18.429315 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-j9j4b" Feb 20 16:47:18 crc kubenswrapper[4697]: I0220 16:47:18.431065 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 20 16:47:18 crc kubenswrapper[4697]: I0220 16:47:18.431674 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 20 16:47:18 crc kubenswrapper[4697]: I0220 16:47:18.431846 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-t2nrk" Feb 20 16:47:18 crc kubenswrapper[4697]: I0220 16:47:18.448553 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j9j4b"] Feb 20 16:47:18 crc kubenswrapper[4697]: I0220 16:47:18.622419 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrngv\" (UniqueName: \"kubernetes.io/projected/5fe14b17-1ee4-47c2-9169-d4db95001cf0-kube-api-access-rrngv\") pod \"openstack-operator-index-j9j4b\" (UID: \"5fe14b17-1ee4-47c2-9169-d4db95001cf0\") " pod="openstack-operators/openstack-operator-index-j9j4b" Feb 20 16:47:18 crc kubenswrapper[4697]: I0220 16:47:18.723345 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrngv\" (UniqueName: \"kubernetes.io/projected/5fe14b17-1ee4-47c2-9169-d4db95001cf0-kube-api-access-rrngv\") pod \"openstack-operator-index-j9j4b\" (UID: \"5fe14b17-1ee4-47c2-9169-d4db95001cf0\") " pod="openstack-operators/openstack-operator-index-j9j4b" Feb 20 16:47:18 crc kubenswrapper[4697]: I0220 16:47:18.741600 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrngv\" 
(UniqueName: \"kubernetes.io/projected/5fe14b17-1ee4-47c2-9169-d4db95001cf0-kube-api-access-rrngv\") pod \"openstack-operator-index-j9j4b\" (UID: \"5fe14b17-1ee4-47c2-9169-d4db95001cf0\") " pod="openstack-operators/openstack-operator-index-j9j4b" Feb 20 16:47:18 crc kubenswrapper[4697]: I0220 16:47:18.749397 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-j9j4b" Feb 20 16:47:18 crc kubenswrapper[4697]: I0220 16:47:18.932419 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j9j4b"] Feb 20 16:47:18 crc kubenswrapper[4697]: W0220 16:47:18.935611 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe14b17_1ee4_47c2_9169_d4db95001cf0.slice/crio-9d0d2cc5d1d9d814102e1a144d0b76815f67096ee487a25f325860eddb1a6bce WatchSource:0}: Error finding container 9d0d2cc5d1d9d814102e1a144d0b76815f67096ee487a25f325860eddb1a6bce: Status 404 returned error can't find the container with id 9d0d2cc5d1d9d814102e1a144d0b76815f67096ee487a25f325860eddb1a6bce Feb 20 16:47:19 crc kubenswrapper[4697]: I0220 16:47:19.891158 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j9j4b" event={"ID":"5fe14b17-1ee4-47c2-9169-d4db95001cf0","Type":"ContainerStarted","Data":"9d0d2cc5d1d9d814102e1a144d0b76815f67096ee487a25f325860eddb1a6bce"} Feb 20 16:47:21 crc kubenswrapper[4697]: I0220 16:47:21.813383 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-j9j4b"] Feb 20 16:47:21 crc kubenswrapper[4697]: I0220 16:47:21.903104 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j9j4b" event={"ID":"5fe14b17-1ee4-47c2-9169-d4db95001cf0","Type":"ContainerStarted","Data":"e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b"} Feb 20 
16:47:21 crc kubenswrapper[4697]: I0220 16:47:21.916672 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-j9j4b" podStartSLOduration=1.9426937130000002 podStartE2EDuration="3.916658027s" podCreationTimestamp="2026-02-20 16:47:18 +0000 UTC" firstStartedPulling="2026-02-20 16:47:18.939023619 +0000 UTC m=+946.719069027" lastFinishedPulling="2026-02-20 16:47:20.912987943 +0000 UTC m=+948.693033341" observedRunningTime="2026-02-20 16:47:21.913940699 +0000 UTC m=+949.693986107" watchObservedRunningTime="2026-02-20 16:47:21.916658027 +0000 UTC m=+949.696703435" Feb 20 16:47:22 crc kubenswrapper[4697]: I0220 16:47:22.423052 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-4nlwj"] Feb 20 16:47:22 crc kubenswrapper[4697]: I0220 16:47:22.424146 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4nlwj" Feb 20 16:47:22 crc kubenswrapper[4697]: I0220 16:47:22.436696 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4nlwj"] Feb 20 16:47:22 crc kubenswrapper[4697]: I0220 16:47:22.587290 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqwjq\" (UniqueName: \"kubernetes.io/projected/b3087f12-ba44-4ef2-af22-3d77e30b1d84-kube-api-access-xqwjq\") pod \"openstack-operator-index-4nlwj\" (UID: \"b3087f12-ba44-4ef2-af22-3d77e30b1d84\") " pod="openstack-operators/openstack-operator-index-4nlwj" Feb 20 16:47:22 crc kubenswrapper[4697]: I0220 16:47:22.688709 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqwjq\" (UniqueName: \"kubernetes.io/projected/b3087f12-ba44-4ef2-af22-3d77e30b1d84-kube-api-access-xqwjq\") pod \"openstack-operator-index-4nlwj\" (UID: \"b3087f12-ba44-4ef2-af22-3d77e30b1d84\") " 
pod="openstack-operators/openstack-operator-index-4nlwj" Feb 20 16:47:22 crc kubenswrapper[4697]: I0220 16:47:22.711209 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqwjq\" (UniqueName: \"kubernetes.io/projected/b3087f12-ba44-4ef2-af22-3d77e30b1d84-kube-api-access-xqwjq\") pod \"openstack-operator-index-4nlwj\" (UID: \"b3087f12-ba44-4ef2-af22-3d77e30b1d84\") " pod="openstack-operators/openstack-operator-index-4nlwj" Feb 20 16:47:22 crc kubenswrapper[4697]: I0220 16:47:22.743806 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4nlwj" Feb 20 16:47:22 crc kubenswrapper[4697]: I0220 16:47:22.909037 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-j9j4b" podUID="5fe14b17-1ee4-47c2-9169-d4db95001cf0" containerName="registry-server" containerID="cri-o://e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b" gracePeriod=2 Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.153123 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4nlwj"] Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.224409 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-j9j4b" Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.399038 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrngv\" (UniqueName: \"kubernetes.io/projected/5fe14b17-1ee4-47c2-9169-d4db95001cf0-kube-api-access-rrngv\") pod \"5fe14b17-1ee4-47c2-9169-d4db95001cf0\" (UID: \"5fe14b17-1ee4-47c2-9169-d4db95001cf0\") " Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.408721 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe14b17-1ee4-47c2-9169-d4db95001cf0-kube-api-access-rrngv" (OuterVolumeSpecName: "kube-api-access-rrngv") pod "5fe14b17-1ee4-47c2-9169-d4db95001cf0" (UID: "5fe14b17-1ee4-47c2-9169-d4db95001cf0"). InnerVolumeSpecName "kube-api-access-rrngv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.500817 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrngv\" (UniqueName: \"kubernetes.io/projected/5fe14b17-1ee4-47c2-9169-d4db95001cf0-kube-api-access-rrngv\") on node \"crc\" DevicePath \"\"" Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.915785 4697 generic.go:334] "Generic (PLEG): container finished" podID="5fe14b17-1ee4-47c2-9169-d4db95001cf0" containerID="e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b" exitCode=0 Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.915836 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-j9j4b" Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.915849 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j9j4b" event={"ID":"5fe14b17-1ee4-47c2-9169-d4db95001cf0","Type":"ContainerDied","Data":"e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b"} Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.915872 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j9j4b" event={"ID":"5fe14b17-1ee4-47c2-9169-d4db95001cf0","Type":"ContainerDied","Data":"9d0d2cc5d1d9d814102e1a144d0b76815f67096ee487a25f325860eddb1a6bce"} Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.915886 4697 scope.go:117] "RemoveContainer" containerID="e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b" Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.918145 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4nlwj" event={"ID":"b3087f12-ba44-4ef2-af22-3d77e30b1d84","Type":"ContainerStarted","Data":"cfb89cbf41ada82a126c00538230b145609d35650e2862e5fa236d98a27e3cec"} Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.918165 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4nlwj" event={"ID":"b3087f12-ba44-4ef2-af22-3d77e30b1d84","Type":"ContainerStarted","Data":"e44ae30b38ca1f25992f49d0c22529a7405a98a1f9a0777a419ff6d26159eec8"} Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.932305 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4nlwj" podStartSLOduration=1.882831236 podStartE2EDuration="1.932288204s" podCreationTimestamp="2026-02-20 16:47:22 +0000 UTC" firstStartedPulling="2026-02-20 16:47:23.172417164 +0000 UTC m=+950.952462572" lastFinishedPulling="2026-02-20 
16:47:23.221874132 +0000 UTC m=+951.001919540" observedRunningTime="2026-02-20 16:47:23.930487979 +0000 UTC m=+951.710533387" watchObservedRunningTime="2026-02-20 16:47:23.932288204 +0000 UTC m=+951.712333622" Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.944473 4697 scope.go:117] "RemoveContainer" containerID="e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b" Feb 20 16:47:23 crc kubenswrapper[4697]: E0220 16:47:23.946183 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b\": container with ID starting with e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b not found: ID does not exist" containerID="e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b" Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.946226 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b"} err="failed to get container status \"e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b\": rpc error: code = NotFound desc = could not find container \"e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b\": container with ID starting with e885267244aaddbc909edcc7d684448f04c3f4d7b1e9fb2c984446327732b83b not found: ID does not exist" Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.950784 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-j9j4b"] Feb 20 16:47:23 crc kubenswrapper[4697]: I0220 16:47:23.954354 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-j9j4b"] Feb 20 16:47:24 crc kubenswrapper[4697]: I0220 16:47:24.886117 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe14b17-1ee4-47c2-9169-d4db95001cf0" 
path="/var/lib/kubelet/pods/5fe14b17-1ee4-47c2-9169-d4db95001cf0/volumes" Feb 20 16:47:32 crc kubenswrapper[4697]: I0220 16:47:32.744765 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-4nlwj" Feb 20 16:47:32 crc kubenswrapper[4697]: I0220 16:47:32.745470 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-4nlwj" Feb 20 16:47:32 crc kubenswrapper[4697]: I0220 16:47:32.789515 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-4nlwj" Feb 20 16:47:33 crc kubenswrapper[4697]: I0220 16:47:33.033158 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-4nlwj" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.454900 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb"] Feb 20 16:47:34 crc kubenswrapper[4697]: E0220 16:47:34.455132 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe14b17-1ee4-47c2-9169-d4db95001cf0" containerName="registry-server" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.455157 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe14b17-1ee4-47c2-9169-d4db95001cf0" containerName="registry-server" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.455280 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fe14b17-1ee4-47c2-9169-d4db95001cf0" containerName="registry-server" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.456231 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.457690 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-hwcs9" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.472801 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb"] Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.657673 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-util\") pod \"d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.657737 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-bundle\") pod \"d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.657848 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7qww\" (UniqueName: \"kubernetes.io/projected/102f803e-192f-41ae-8742-2c9ba8ad7806-kube-api-access-r7qww\") pod \"d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 
16:47:34.758783 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-util\") pod \"d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.758843 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-bundle\") pod \"d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.758916 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7qww\" (UniqueName: \"kubernetes.io/projected/102f803e-192f-41ae-8742-2c9ba8ad7806-kube-api-access-r7qww\") pod \"d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.759288 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-util\") pod \"d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.759372 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-bundle\") pod \"d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:34 crc kubenswrapper[4697]: I0220 16:47:34.777451 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7qww\" (UniqueName: \"kubernetes.io/projected/102f803e-192f-41ae-8742-2c9ba8ad7806-kube-api-access-r7qww\") pod \"d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:35 crc kubenswrapper[4697]: I0220 16:47:35.070738 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:35 crc kubenswrapper[4697]: I0220 16:47:35.476347 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb"] Feb 20 16:47:35 crc kubenswrapper[4697]: W0220 16:47:35.481630 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod102f803e_192f_41ae_8742_2c9ba8ad7806.slice/crio-4c3b9321936d0bee5c5eb8d0fb116ac7d372144804caf3547288761246f405e5 WatchSource:0}: Error finding container 4c3b9321936d0bee5c5eb8d0fb116ac7d372144804caf3547288761246f405e5: Status 404 returned error can't find the container with id 4c3b9321936d0bee5c5eb8d0fb116ac7d372144804caf3547288761246f405e5 Feb 20 16:47:36 crc kubenswrapper[4697]: I0220 16:47:36.024758 4697 generic.go:334] "Generic (PLEG): container finished" podID="102f803e-192f-41ae-8742-2c9ba8ad7806" containerID="4399df0deef8163d51b5878735f9e329c327cb5b8d22866968169dd58143d54b" exitCode=0 Feb 20 
16:47:36 crc kubenswrapper[4697]: I0220 16:47:36.024943 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" event={"ID":"102f803e-192f-41ae-8742-2c9ba8ad7806","Type":"ContainerDied","Data":"4399df0deef8163d51b5878735f9e329c327cb5b8d22866968169dd58143d54b"} Feb 20 16:47:36 crc kubenswrapper[4697]: I0220 16:47:36.026705 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" event={"ID":"102f803e-192f-41ae-8742-2c9ba8ad7806","Type":"ContainerStarted","Data":"4c3b9321936d0bee5c5eb8d0fb116ac7d372144804caf3547288761246f405e5"} Feb 20 16:47:37 crc kubenswrapper[4697]: I0220 16:47:37.037975 4697 generic.go:334] "Generic (PLEG): container finished" podID="102f803e-192f-41ae-8742-2c9ba8ad7806" containerID="d62445a9721f1dec18c631e047a3dcd6674bac4f59270622252e25efa8a8bd07" exitCode=0 Feb 20 16:47:37 crc kubenswrapper[4697]: I0220 16:47:37.038025 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" event={"ID":"102f803e-192f-41ae-8742-2c9ba8ad7806","Type":"ContainerDied","Data":"d62445a9721f1dec18c631e047a3dcd6674bac4f59270622252e25efa8a8bd07"} Feb 20 16:47:38 crc kubenswrapper[4697]: I0220 16:47:38.045748 4697 generic.go:334] "Generic (PLEG): container finished" podID="102f803e-192f-41ae-8742-2c9ba8ad7806" containerID="056507871c7f7fed1ee03bdea0b7434fce4e05dbf31af385269d5b85ade87410" exitCode=0 Feb 20 16:47:38 crc kubenswrapper[4697]: I0220 16:47:38.045808 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" event={"ID":"102f803e-192f-41ae-8742-2c9ba8ad7806","Type":"ContainerDied","Data":"056507871c7f7fed1ee03bdea0b7434fce4e05dbf31af385269d5b85ade87410"} Feb 20 16:47:39 crc kubenswrapper[4697]: I0220 16:47:39.356140 
4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:39 crc kubenswrapper[4697]: I0220 16:47:39.531900 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-bundle\") pod \"102f803e-192f-41ae-8742-2c9ba8ad7806\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " Feb 20 16:47:39 crc kubenswrapper[4697]: I0220 16:47:39.532084 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-util\") pod \"102f803e-192f-41ae-8742-2c9ba8ad7806\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " Feb 20 16:47:39 crc kubenswrapper[4697]: I0220 16:47:39.532121 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7qww\" (UniqueName: \"kubernetes.io/projected/102f803e-192f-41ae-8742-2c9ba8ad7806-kube-api-access-r7qww\") pod \"102f803e-192f-41ae-8742-2c9ba8ad7806\" (UID: \"102f803e-192f-41ae-8742-2c9ba8ad7806\") " Feb 20 16:47:39 crc kubenswrapper[4697]: I0220 16:47:39.533159 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-bundle" (OuterVolumeSpecName: "bundle") pod "102f803e-192f-41ae-8742-2c9ba8ad7806" (UID: "102f803e-192f-41ae-8742-2c9ba8ad7806"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:47:39 crc kubenswrapper[4697]: I0220 16:47:39.540591 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/102f803e-192f-41ae-8742-2c9ba8ad7806-kube-api-access-r7qww" (OuterVolumeSpecName: "kube-api-access-r7qww") pod "102f803e-192f-41ae-8742-2c9ba8ad7806" (UID: "102f803e-192f-41ae-8742-2c9ba8ad7806"). 
InnerVolumeSpecName "kube-api-access-r7qww". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:47:39 crc kubenswrapper[4697]: I0220 16:47:39.551077 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-util" (OuterVolumeSpecName: "util") pod "102f803e-192f-41ae-8742-2c9ba8ad7806" (UID: "102f803e-192f-41ae-8742-2c9ba8ad7806"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:47:39 crc kubenswrapper[4697]: I0220 16:47:39.633122 4697 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:47:39 crc kubenswrapper[4697]: I0220 16:47:39.633159 4697 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/102f803e-192f-41ae-8742-2c9ba8ad7806-util\") on node \"crc\" DevicePath \"\"" Feb 20 16:47:39 crc kubenswrapper[4697]: I0220 16:47:39.633168 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7qww\" (UniqueName: \"kubernetes.io/projected/102f803e-192f-41ae-8742-2c9ba8ad7806-kube-api-access-r7qww\") on node \"crc\" DevicePath \"\"" Feb 20 16:47:40 crc kubenswrapper[4697]: I0220 16:47:40.065782 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" event={"ID":"102f803e-192f-41ae-8742-2c9ba8ad7806","Type":"ContainerDied","Data":"4c3b9321936d0bee5c5eb8d0fb116ac7d372144804caf3547288761246f405e5"} Feb 20 16:47:40 crc kubenswrapper[4697]: I0220 16:47:40.065833 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c3b9321936d0bee5c5eb8d0fb116ac7d372144804caf3547288761246f405e5" Feb 20 16:47:40 crc kubenswrapper[4697]: I0220 16:47:40.065875 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb" Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.608884 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x"] Feb 20 16:47:46 crc kubenswrapper[4697]: E0220 16:47:46.609649 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102f803e-192f-41ae-8742-2c9ba8ad7806" containerName="extract" Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.609661 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="102f803e-192f-41ae-8742-2c9ba8ad7806" containerName="extract" Feb 20 16:47:46 crc kubenswrapper[4697]: E0220 16:47:46.609673 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102f803e-192f-41ae-8742-2c9ba8ad7806" containerName="util" Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.609680 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="102f803e-192f-41ae-8742-2c9ba8ad7806" containerName="util" Feb 20 16:47:46 crc kubenswrapper[4697]: E0220 16:47:46.609693 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102f803e-192f-41ae-8742-2c9ba8ad7806" containerName="pull" Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.609699 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="102f803e-192f-41ae-8742-2c9ba8ad7806" containerName="pull" Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.609804 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="102f803e-192f-41ae-8742-2c9ba8ad7806" containerName="extract" Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.610186 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x" Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.613729 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-zr6ks" Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.629415 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w9tv\" (UniqueName: \"kubernetes.io/projected/9df0ae9c-f41a-4c92-b62e-ff0f230da65c-kube-api-access-6w9tv\") pod \"openstack-operator-controller-init-64dbc77f9f-pqx8x\" (UID: \"9df0ae9c-f41a-4c92-b62e-ff0f230da65c\") " pod="openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x" Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.640070 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x"] Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.730469 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w9tv\" (UniqueName: \"kubernetes.io/projected/9df0ae9c-f41a-4c92-b62e-ff0f230da65c-kube-api-access-6w9tv\") pod \"openstack-operator-controller-init-64dbc77f9f-pqx8x\" (UID: \"9df0ae9c-f41a-4c92-b62e-ff0f230da65c\") " pod="openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x" Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.751884 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w9tv\" (UniqueName: \"kubernetes.io/projected/9df0ae9c-f41a-4c92-b62e-ff0f230da65c-kube-api-access-6w9tv\") pod \"openstack-operator-controller-init-64dbc77f9f-pqx8x\" (UID: \"9df0ae9c-f41a-4c92-b62e-ff0f230da65c\") " pod="openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x" Feb 20 16:47:46 crc kubenswrapper[4697]: I0220 16:47:46.930168 4697 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x" Feb 20 16:47:47 crc kubenswrapper[4697]: I0220 16:47:47.170534 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x"] Feb 20 16:47:48 crc kubenswrapper[4697]: I0220 16:47:48.120368 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x" event={"ID":"9df0ae9c-f41a-4c92-b62e-ff0f230da65c","Type":"ContainerStarted","Data":"521b268aaabc35eec14c6a544ed27bf535f12ad089837da6e616b0331080f042"} Feb 20 16:47:51 crc kubenswrapper[4697]: I0220 16:47:51.138768 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x" event={"ID":"9df0ae9c-f41a-4c92-b62e-ff0f230da65c","Type":"ContainerStarted","Data":"b0c8418520e6d8da69c00248f005ce9f37fe60bbb93a1d84bfe08c5d6824054a"} Feb 20 16:47:51 crc kubenswrapper[4697]: I0220 16:47:51.139231 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x" Feb 20 16:47:51 crc kubenswrapper[4697]: I0220 16:47:51.176166 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x" podStartSLOduration=1.420559607 podStartE2EDuration="5.176151224s" podCreationTimestamp="2026-02-20 16:47:46 +0000 UTC" firstStartedPulling="2026-02-20 16:47:47.179575364 +0000 UTC m=+974.959620772" lastFinishedPulling="2026-02-20 16:47:50.935166981 +0000 UTC m=+978.715212389" observedRunningTime="2026-02-20 16:47:51.17562566 +0000 UTC m=+978.955671068" watchObservedRunningTime="2026-02-20 16:47:51.176151224 +0000 UTC m=+978.956196622" Feb 20 16:47:56 crc kubenswrapper[4697]: I0220 16:47:56.932401 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-64dbc77f9f-pqx8x" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.130910 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ncflg"] Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.132165 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.145316 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ncflg"] Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.292864 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-utilities\") pod \"certified-operators-ncflg\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.292933 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-catalog-content\") pod \"certified-operators-ncflg\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.292974 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfxnh\" (UniqueName: \"kubernetes.io/projected/75e7ef02-f555-4fe4-8107-15e58d45c3cc-kube-api-access-hfxnh\") pod \"certified-operators-ncflg\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.394586 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-utilities\") pod \"certified-operators-ncflg\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.394650 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-catalog-content\") pod \"certified-operators-ncflg\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.394669 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfxnh\" (UniqueName: \"kubernetes.io/projected/75e7ef02-f555-4fe4-8107-15e58d45c3cc-kube-api-access-hfxnh\") pod \"certified-operators-ncflg\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.395155 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-catalog-content\") pod \"certified-operators-ncflg\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.395189 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-utilities\") pod \"certified-operators-ncflg\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.413599 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hfxnh\" (UniqueName: \"kubernetes.io/projected/75e7ef02-f555-4fe4-8107-15e58d45c3cc-kube-api-access-hfxnh\") pod \"certified-operators-ncflg\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.450679 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:47:58 crc kubenswrapper[4697]: I0220 16:47:58.894286 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ncflg"] Feb 20 16:47:59 crc kubenswrapper[4697]: I0220 16:47:59.199822 4697 generic.go:334] "Generic (PLEG): container finished" podID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" containerID="b8f1a822ad5b479367851b10192d3e328044ecb71f5e2e94a6ec471da7ef85bd" exitCode=0 Feb 20 16:47:59 crc kubenswrapper[4697]: I0220 16:47:59.199862 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncflg" event={"ID":"75e7ef02-f555-4fe4-8107-15e58d45c3cc","Type":"ContainerDied","Data":"b8f1a822ad5b479367851b10192d3e328044ecb71f5e2e94a6ec471da7ef85bd"} Feb 20 16:47:59 crc kubenswrapper[4697]: I0220 16:47:59.199912 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncflg" event={"ID":"75e7ef02-f555-4fe4-8107-15e58d45c3cc","Type":"ContainerStarted","Data":"d9d01f4cccb705026bd05e2f8eb216c8f2f991ca0e671cf94beb0032b24f6871"} Feb 20 16:48:00 crc kubenswrapper[4697]: I0220 16:48:00.216734 4697 generic.go:334] "Generic (PLEG): container finished" podID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" containerID="9aef3e0f8410c4c9e9356c89f60cbedbc03980920fafa85db5d5cbecbfecaa58" exitCode=0 Feb 20 16:48:00 crc kubenswrapper[4697]: I0220 16:48:00.216800 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncflg" 
event={"ID":"75e7ef02-f555-4fe4-8107-15e58d45c3cc","Type":"ContainerDied","Data":"9aef3e0f8410c4c9e9356c89f60cbedbc03980920fafa85db5d5cbecbfecaa58"} Feb 20 16:48:00 crc kubenswrapper[4697]: I0220 16:48:00.218301 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 16:48:01 crc kubenswrapper[4697]: I0220 16:48:01.225227 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncflg" event={"ID":"75e7ef02-f555-4fe4-8107-15e58d45c3cc","Type":"ContainerStarted","Data":"495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71"} Feb 20 16:48:01 crc kubenswrapper[4697]: I0220 16:48:01.246328 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ncflg" podStartSLOduration=1.843618677 podStartE2EDuration="3.246310034s" podCreationTimestamp="2026-02-20 16:47:58 +0000 UTC" firstStartedPulling="2026-02-20 16:47:59.201074882 +0000 UTC m=+986.981120300" lastFinishedPulling="2026-02-20 16:48:00.603766249 +0000 UTC m=+988.383811657" observedRunningTime="2026-02-20 16:48:01.241203204 +0000 UTC m=+989.021248612" watchObservedRunningTime="2026-02-20 16:48:01.246310034 +0000 UTC m=+989.026355442" Feb 20 16:48:08 crc kubenswrapper[4697]: I0220 16:48:08.451814 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:48:08 crc kubenswrapper[4697]: I0220 16:48:08.452342 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:48:08 crc kubenswrapper[4697]: I0220 16:48:08.500681 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:48:09 crc kubenswrapper[4697]: I0220 16:48:09.337956 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:48:09 crc kubenswrapper[4697]: I0220 16:48:09.380519 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ncflg"] Feb 20 16:48:11 crc kubenswrapper[4697]: I0220 16:48:11.304505 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ncflg" podUID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" containerName="registry-server" containerID="cri-o://495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71" gracePeriod=2 Feb 20 16:48:11 crc kubenswrapper[4697]: I0220 16:48:11.751980 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:48:11 crc kubenswrapper[4697]: I0220 16:48:11.866580 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-utilities\") pod \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " Feb 20 16:48:11 crc kubenswrapper[4697]: I0220 16:48:11.866654 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfxnh\" (UniqueName: \"kubernetes.io/projected/75e7ef02-f555-4fe4-8107-15e58d45c3cc-kube-api-access-hfxnh\") pod \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " Feb 20 16:48:11 crc kubenswrapper[4697]: I0220 16:48:11.866734 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-catalog-content\") pod \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\" (UID: \"75e7ef02-f555-4fe4-8107-15e58d45c3cc\") " Feb 20 16:48:11 crc kubenswrapper[4697]: I0220 16:48:11.867527 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-utilities" (OuterVolumeSpecName: "utilities") pod "75e7ef02-f555-4fe4-8107-15e58d45c3cc" (UID: "75e7ef02-f555-4fe4-8107-15e58d45c3cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:48:11 crc kubenswrapper[4697]: I0220 16:48:11.885777 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e7ef02-f555-4fe4-8107-15e58d45c3cc-kube-api-access-hfxnh" (OuterVolumeSpecName: "kube-api-access-hfxnh") pod "75e7ef02-f555-4fe4-8107-15e58d45c3cc" (UID: "75e7ef02-f555-4fe4-8107-15e58d45c3cc"). InnerVolumeSpecName "kube-api-access-hfxnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:48:11 crc kubenswrapper[4697]: I0220 16:48:11.927588 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75e7ef02-f555-4fe4-8107-15e58d45c3cc" (UID: "75e7ef02-f555-4fe4-8107-15e58d45c3cc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:48:11 crc kubenswrapper[4697]: I0220 16:48:11.968618 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:48:11 crc kubenswrapper[4697]: I0220 16:48:11.968652 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfxnh\" (UniqueName: \"kubernetes.io/projected/75e7ef02-f555-4fe4-8107-15e58d45c3cc-kube-api-access-hfxnh\") on node \"crc\" DevicePath \"\"" Feb 20 16:48:11 crc kubenswrapper[4697]: I0220 16:48:11.968662 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e7ef02-f555-4fe4-8107-15e58d45c3cc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.311853 4697 generic.go:334] "Generic (PLEG): container finished" podID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" containerID="495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71" exitCode=0 Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.311911 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ncflg" Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.311911 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncflg" event={"ID":"75e7ef02-f555-4fe4-8107-15e58d45c3cc","Type":"ContainerDied","Data":"495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71"} Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.311979 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncflg" event={"ID":"75e7ef02-f555-4fe4-8107-15e58d45c3cc","Type":"ContainerDied","Data":"d9d01f4cccb705026bd05e2f8eb216c8f2f991ca0e671cf94beb0032b24f6871"} Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.312000 4697 scope.go:117] "RemoveContainer" containerID="495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71" Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.332753 4697 scope.go:117] "RemoveContainer" containerID="9aef3e0f8410c4c9e9356c89f60cbedbc03980920fafa85db5d5cbecbfecaa58" Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.347446 4697 scope.go:117] "RemoveContainer" containerID="b8f1a822ad5b479367851b10192d3e328044ecb71f5e2e94a6ec471da7ef85bd" Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.353420 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ncflg"] Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.358820 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ncflg"] Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.369862 4697 scope.go:117] "RemoveContainer" containerID="495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71" Feb 20 16:48:12 crc kubenswrapper[4697]: E0220 16:48:12.370243 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71\": container with ID starting with 495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71 not found: ID does not exist" containerID="495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71" Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.370277 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71"} err="failed to get container status \"495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71\": rpc error: code = NotFound desc = could not find container \"495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71\": container with ID starting with 495b8ad802622a412ff564d57926bda9cdbc1727be17e6f4e1239c3224191c71 not found: ID does not exist" Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.370303 4697 scope.go:117] "RemoveContainer" containerID="9aef3e0f8410c4c9e9356c89f60cbedbc03980920fafa85db5d5cbecbfecaa58" Feb 20 16:48:12 crc kubenswrapper[4697]: E0220 16:48:12.370558 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aef3e0f8410c4c9e9356c89f60cbedbc03980920fafa85db5d5cbecbfecaa58\": container with ID starting with 9aef3e0f8410c4c9e9356c89f60cbedbc03980920fafa85db5d5cbecbfecaa58 not found: ID does not exist" containerID="9aef3e0f8410c4c9e9356c89f60cbedbc03980920fafa85db5d5cbecbfecaa58" Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.370587 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aef3e0f8410c4c9e9356c89f60cbedbc03980920fafa85db5d5cbecbfecaa58"} err="failed to get container status \"9aef3e0f8410c4c9e9356c89f60cbedbc03980920fafa85db5d5cbecbfecaa58\": rpc error: code = NotFound desc = could not find container \"9aef3e0f8410c4c9e9356c89f60cbedbc03980920fafa85db5d5cbecbfecaa58\": container with ID 
starting with 9aef3e0f8410c4c9e9356c89f60cbedbc03980920fafa85db5d5cbecbfecaa58 not found: ID does not exist" Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.370635 4697 scope.go:117] "RemoveContainer" containerID="b8f1a822ad5b479367851b10192d3e328044ecb71f5e2e94a6ec471da7ef85bd" Feb 20 16:48:12 crc kubenswrapper[4697]: E0220 16:48:12.370871 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8f1a822ad5b479367851b10192d3e328044ecb71f5e2e94a6ec471da7ef85bd\": container with ID starting with b8f1a822ad5b479367851b10192d3e328044ecb71f5e2e94a6ec471da7ef85bd not found: ID does not exist" containerID="b8f1a822ad5b479367851b10192d3e328044ecb71f5e2e94a6ec471da7ef85bd" Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.370898 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f1a822ad5b479367851b10192d3e328044ecb71f5e2e94a6ec471da7ef85bd"} err="failed to get container status \"b8f1a822ad5b479367851b10192d3e328044ecb71f5e2e94a6ec471da7ef85bd\": rpc error: code = NotFound desc = could not find container \"b8f1a822ad5b479367851b10192d3e328044ecb71f5e2e94a6ec471da7ef85bd\": container with ID starting with b8f1a822ad5b479367851b10192d3e328044ecb71f5e2e94a6ec471da7ef85bd not found: ID does not exist" Feb 20 16:48:12 crc kubenswrapper[4697]: I0220 16:48:12.885285 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" path="/var/lib/kubelet/pods/75e7ef02-f555-4fe4-8107-15e58d45c3cc/volumes" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.645478 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr"] Feb 20 16:48:16 crc kubenswrapper[4697]: E0220 16:48:16.646004 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" containerName="registry-server" 
Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.646016 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" containerName="registry-server" Feb 20 16:48:16 crc kubenswrapper[4697]: E0220 16:48:16.646032 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" containerName="extract-content" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.646039 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" containerName="extract-content" Feb 20 16:48:16 crc kubenswrapper[4697]: E0220 16:48:16.646049 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" containerName="extract-utilities" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.646056 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" containerName="extract-utilities" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.646158 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e7ef02-f555-4fe4-8107-15e58d45c3cc" containerName="registry-server" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.646571 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.648125 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-hq827" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.652940 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.653855 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.656423 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-tcfll" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.665988 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.673559 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.674588 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.677346 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-nq9nc" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.692514 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.699233 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.711501 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.712484 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.717197 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qv4z7" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.728010 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.728743 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.730992 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5csqv\" (UniqueName: \"kubernetes.io/projected/fcf00eef-940f-4da3-8359-325f1abb0c6d-kube-api-access-5csqv\") pod \"barbican-operator-controller-manager-868647ff47-kqjzr\" (UID: \"fcf00eef-940f-4da3-8359-325f1abb0c6d\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.737777 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-qlll4" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.745497 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.751309 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.752578 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.757086 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-g6mbp" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.757294 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.775797 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.784352 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.785310 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.791991 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-h8cgm" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.792004 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.795378 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.796418 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.799228 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-n5jwp" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.802334 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.803218 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.805044 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-f8c7k" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.812166 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.832178 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bccc\" (UniqueName: \"kubernetes.io/projected/b0605f71-51c5-49d9-8936-77affb7cf0bf-kube-api-access-4bccc\") pod \"horizon-operator-controller-manager-5b9b8895d5-qjfrm\" (UID: \"b0605f71-51c5-49d9-8936-77affb7cf0bf\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.832226 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5csqv\" (UniqueName: \"kubernetes.io/projected/fcf00eef-940f-4da3-8359-325f1abb0c6d-kube-api-access-5csqv\") pod \"barbican-operator-controller-manager-868647ff47-kqjzr\" (UID: \"fcf00eef-940f-4da3-8359-325f1abb0c6d\") " 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.832252 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq4dj\" (UniqueName: \"kubernetes.io/projected/d233b891-dac3-4565-971b-85141828260d-kube-api-access-fq4dj\") pod \"heat-operator-controller-manager-69f49c598c-4nqmq\" (UID: \"d233b891-dac3-4565-971b-85141828260d\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.832284 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgcb4\" (UniqueName: \"kubernetes.io/projected/0b5f03ab-32bb-48ef-b7d7-1ede5fb51924-kube-api-access-vgcb4\") pod \"cinder-operator-controller-manager-5d946d989d-9zdp9\" (UID: \"0b5f03ab-32bb-48ef-b7d7-1ede5fb51924\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.832304 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzbxt\" (UniqueName: \"kubernetes.io/projected/71f3f3ad-1f6c-4d59-9fc8-b036014c1068-kube-api-access-dzbxt\") pod \"glance-operator-controller-manager-77987464f4-5xkg5\" (UID: \"71f3f3ad-1f6c-4d59-9fc8-b036014c1068\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.832324 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gppz\" (UniqueName: \"kubernetes.io/projected/f56f8133-887c-456a-9cbf-6df7713789b3-kube-api-access-9gppz\") pod \"designate-operator-controller-manager-6d8bf5c495-jxdb5\" (UID: \"f56f8133-887c-456a-9cbf-6df7713789b3\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5" Feb 
20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.843155 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.847457 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.850996 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.851756 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.855080 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-mzxrt" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.863670 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5csqv\" (UniqueName: \"kubernetes.io/projected/fcf00eef-940f-4da3-8359-325f1abb0c6d-kube-api-access-5csqv\") pod \"barbican-operator-controller-manager-868647ff47-kqjzr\" (UID: \"fcf00eef-940f-4da3-8359-325f1abb0c6d\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.892811 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.893487 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.896211 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.903308 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-gfhzg" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.946682 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq4dj\" (UniqueName: \"kubernetes.io/projected/d233b891-dac3-4565-971b-85141828260d-kube-api-access-fq4dj\") pod \"heat-operator-controller-manager-69f49c598c-4nqmq\" (UID: \"d233b891-dac3-4565-971b-85141828260d\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.946789 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgcb4\" (UniqueName: \"kubernetes.io/projected/0b5f03ab-32bb-48ef-b7d7-1ede5fb51924-kube-api-access-vgcb4\") pod \"cinder-operator-controller-manager-5d946d989d-9zdp9\" (UID: \"0b5f03ab-32bb-48ef-b7d7-1ede5fb51924\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.946826 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qfjn\" (UniqueName: \"kubernetes.io/projected/d30d696e-1555-4fc2-9316-c795de608048-kube-api-access-9qfjn\") pod \"infra-operator-controller-manager-79d975b745-j9lmt\" (UID: \"d30d696e-1555-4fc2-9316-c795de608048\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.946856 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dzbxt\" (UniqueName: \"kubernetes.io/projected/71f3f3ad-1f6c-4d59-9fc8-b036014c1068-kube-api-access-dzbxt\") pod \"glance-operator-controller-manager-77987464f4-5xkg5\" (UID: \"71f3f3ad-1f6c-4d59-9fc8-b036014c1068\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.946905 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gppz\" (UniqueName: \"kubernetes.io/projected/f56f8133-887c-456a-9cbf-6df7713789b3-kube-api-access-9gppz\") pod \"designate-operator-controller-manager-6d8bf5c495-jxdb5\" (UID: \"f56f8133-887c-456a-9cbf-6df7713789b3\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.946928 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc64q\" (UniqueName: \"kubernetes.io/projected/560f1df6-c03f-42ad-8175-5508f56e1ecc-kube-api-access-bc64q\") pod \"manila-operator-controller-manager-54f6768c69-dknf8\" (UID: \"560f1df6-c03f-42ad-8175-5508f56e1ecc\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.947021 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffzkh\" (UniqueName: \"kubernetes.io/projected/d9926f6e-afca-48ad-8a52-fb7f53ba3dec-kube-api-access-ffzkh\") pod \"ironic-operator-controller-manager-554564d7fc-82v6j\" (UID: \"d9926f6e-afca-48ad-8a52-fb7f53ba3dec\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.947042 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d8bt\" (UniqueName: 
\"kubernetes.io/projected/4fdb933f-aa86-4b88-9b08-4783ce0f6e0c-kube-api-access-6d8bt\") pod \"keystone-operator-controller-manager-b4d948c87-rw4q5\" (UID: \"4fdb933f-aa86-4b88-9b08-4783ce0f6e0c\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.947096 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert\") pod \"infra-operator-controller-manager-79d975b745-j9lmt\" (UID: \"d30d696e-1555-4fc2-9316-c795de608048\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.947124 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bccc\" (UniqueName: \"kubernetes.io/projected/b0605f71-51c5-49d9-8936-77affb7cf0bf-kube-api-access-4bccc\") pod \"horizon-operator-controller-manager-5b9b8895d5-qjfrm\" (UID: \"b0605f71-51c5-49d9-8936-77affb7cf0bf\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.967699 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.968633 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.969222 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgcb4\" (UniqueName: \"kubernetes.io/projected/0b5f03ab-32bb-48ef-b7d7-1ede5fb51924-kube-api-access-vgcb4\") pod \"cinder-operator-controller-manager-5d946d989d-9zdp9\" (UID: \"0b5f03ab-32bb-48ef-b7d7-1ede5fb51924\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.971183 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.978270 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-9ll7j" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.982095 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq4dj\" (UniqueName: \"kubernetes.io/projected/d233b891-dac3-4565-971b-85141828260d-kube-api-access-fq4dj\") pod \"heat-operator-controller-manager-69f49c598c-4nqmq\" (UID: \"d233b891-dac3-4565-971b-85141828260d\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.991336 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bccc\" (UniqueName: \"kubernetes.io/projected/b0605f71-51c5-49d9-8936-77affb7cf0bf-kube-api-access-4bccc\") pod \"horizon-operator-controller-manager-5b9b8895d5-qjfrm\" (UID: \"b0605f71-51c5-49d9-8936-77affb7cf0bf\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm" Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.993208 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns"] Feb 20 16:48:16 crc kubenswrapper[4697]: I0220 16:48:16.998968 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzbxt\" (UniqueName: \"kubernetes.io/projected/71f3f3ad-1f6c-4d59-9fc8-b036014c1068-kube-api-access-dzbxt\") pod \"glance-operator-controller-manager-77987464f4-5xkg5\" (UID: \"71f3f3ad-1f6c-4d59-9fc8-b036014c1068\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.000885 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.005536 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gppz\" (UniqueName: \"kubernetes.io/projected/f56f8133-887c-456a-9cbf-6df7713789b3-kube-api-access-9gppz\") pod \"designate-operator-controller-manager-6d8bf5c495-jxdb5\" (UID: \"f56f8133-887c-456a-9cbf-6df7713789b3\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.012726 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.017818 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.030812 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.031686 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.035713 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.037101 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.045934 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.051909 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-lbxn7" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.052111 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rxldg" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.052949 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.053668 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc64q\" (UniqueName: \"kubernetes.io/projected/560f1df6-c03f-42ad-8175-5508f56e1ecc-kube-api-access-bc64q\") pod \"manila-operator-controller-manager-54f6768c69-dknf8\" (UID: \"560f1df6-c03f-42ad-8175-5508f56e1ecc\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.053737 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffzkh\" (UniqueName: \"kubernetes.io/projected/d9926f6e-afca-48ad-8a52-fb7f53ba3dec-kube-api-access-ffzkh\") pod \"ironic-operator-controller-manager-554564d7fc-82v6j\" (UID: \"d9926f6e-afca-48ad-8a52-fb7f53ba3dec\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.053757 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d8bt\" (UniqueName: \"kubernetes.io/projected/4fdb933f-aa86-4b88-9b08-4783ce0f6e0c-kube-api-access-6d8bt\") pod \"keystone-operator-controller-manager-b4d948c87-rw4q5\" (UID: \"4fdb933f-aa86-4b88-9b08-4783ce0f6e0c\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.053784 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert\") pod \"infra-operator-controller-manager-79d975b745-j9lmt\" (UID: \"d30d696e-1555-4fc2-9316-c795de608048\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.053836 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftg4h\" (UniqueName: \"kubernetes.io/projected/366fd13a-060b-4572-9541-dbf88a507588-kube-api-access-ftg4h\") pod \"mariadb-operator-controller-manager-6994f66f48-gsmns\" (UID: \"366fd13a-060b-4572-9541-dbf88a507588\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.053860 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qfjn\" (UniqueName: \"kubernetes.io/projected/d30d696e-1555-4fc2-9316-c795de608048-kube-api-access-9qfjn\") pod \"infra-operator-controller-manager-79d975b745-j9lmt\" (UID: \"d30d696e-1555-4fc2-9316-c795de608048\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.054297 4697 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.054343 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert podName:d30d696e-1555-4fc2-9316-c795de608048 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:17.554328693 +0000 UTC m=+1005.334374101 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert") pod "infra-operator-controller-manager-79d975b745-j9lmt" (UID: "d30d696e-1555-4fc2-9316-c795de608048") : secret "infra-operator-webhook-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.056362 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.071900 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.073013 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.078734 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.079251 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jh8v8" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.079526 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.085646 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.091231 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qfjn\" (UniqueName: \"kubernetes.io/projected/d30d696e-1555-4fc2-9316-c795de608048-kube-api-access-9qfjn\") pod \"infra-operator-controller-manager-79d975b745-j9lmt\" (UID: \"d30d696e-1555-4fc2-9316-c795de608048\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.094891 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc64q\" (UniqueName: \"kubernetes.io/projected/560f1df6-c03f-42ad-8175-5508f56e1ecc-kube-api-access-bc64q\") pod \"manila-operator-controller-manager-54f6768c69-dknf8\" (UID: \"560f1df6-c03f-42ad-8175-5508f56e1ecc\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.095499 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.100577 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.101404 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.103073 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fs25x" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.109482 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.110313 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.112158 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qzc9n" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.112759 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d8bt\" (UniqueName: \"kubernetes.io/projected/4fdb933f-aa86-4b88-9b08-4783ce0f6e0c-kube-api-access-6d8bt\") pod \"keystone-operator-controller-manager-b4d948c87-rw4q5\" (UID: \"4fdb933f-aa86-4b88-9b08-4783ce0f6e0c\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.126329 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.127151 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.128626 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.132286 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-jnjs8" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.136378 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.144934 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.144938 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.156410 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjfq9\" (UniqueName: \"kubernetes.io/projected/8b48098d-ef4c-4cde-beef-a7c34573699b-kube-api-access-kjfq9\") pod \"octavia-operator-controller-manager-69f8888797-7jlc2\" (UID: \"8b48098d-ef4c-4cde-beef-a7c34573699b\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.156517 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftg4h\" (UniqueName: \"kubernetes.io/projected/366fd13a-060b-4572-9541-dbf88a507588-kube-api-access-ftg4h\") pod \"mariadb-operator-controller-manager-6994f66f48-gsmns\" (UID: \"366fd13a-060b-4572-9541-dbf88a507588\") " 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.156545 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptkvg\" (UniqueName: \"kubernetes.io/projected/cae8d6b1-4649-40bb-b710-1197ac78db1b-kube-api-access-ptkvg\") pod \"nova-operator-controller-manager-567668f5cf-4mtzl\" (UID: \"cae8d6b1-4649-40bb-b710-1197ac78db1b\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.156582 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s25hv\" (UniqueName: \"kubernetes.io/projected/21c4fbfe-19b7-4303-b4bf-72dbc90044dd-kube-api-access-s25hv\") pod \"neutron-operator-controller-manager-64ddbf8bb-ct4nk\" (UID: \"21c4fbfe-19b7-4303-b4bf-72dbc90044dd\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.162683 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffzkh\" (UniqueName: \"kubernetes.io/projected/d9926f6e-afca-48ad-8a52-fb7f53ba3dec-kube-api-access-ffzkh\") pod \"ironic-operator-controller-manager-554564d7fc-82v6j\" (UID: \"d9926f6e-afca-48ad-8a52-fb7f53ba3dec\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.164118 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.164930 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.169861 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7fmsx" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.182652 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.188995 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftg4h\" (UniqueName: \"kubernetes.io/projected/366fd13a-060b-4572-9541-dbf88a507588-kube-api-access-ftg4h\") pod \"mariadb-operator-controller-manager-6994f66f48-gsmns\" (UID: \"366fd13a-060b-4572-9541-dbf88a507588\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.213996 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.243161 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.246482 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-wjwzl"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.247252 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.255035 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-zwlzk" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.263909 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6tzf\" (UniqueName: \"kubernetes.io/projected/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-kube-api-access-f6tzf\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8\" (UID: \"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.263950 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptkvg\" (UniqueName: \"kubernetes.io/projected/cae8d6b1-4649-40bb-b710-1197ac78db1b-kube-api-access-ptkvg\") pod \"nova-operator-controller-manager-567668f5cf-4mtzl\" (UID: \"cae8d6b1-4649-40bb-b710-1197ac78db1b\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.263982 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn9wp\" (UniqueName: \"kubernetes.io/projected/8c8d2c10-e4b6-4d37-977f-7ad685981d2f-kube-api-access-zn9wp\") pod \"placement-operator-controller-manager-8497b45c89-l5qll\" (UID: \"8c8d2c10-e4b6-4d37-977f-7ad685981d2f\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.264003 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s25hv\" (UniqueName: 
\"kubernetes.io/projected/21c4fbfe-19b7-4303-b4bf-72dbc90044dd-kube-api-access-s25hv\") pod \"neutron-operator-controller-manager-64ddbf8bb-ct4nk\" (UID: \"21c4fbfe-19b7-4303-b4bf-72dbc90044dd\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.264029 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jcch\" (UniqueName: \"kubernetes.io/projected/55a77be1-9486-4b8a-acc6-a4d8532016d3-kube-api-access-2jcch\") pod \"ovn-operator-controller-manager-d44cf6b75-lsgnq\" (UID: \"55a77be1-9486-4b8a-acc6-a4d8532016d3\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.264059 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8\" (UID: \"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.264076 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z27mt\" (UniqueName: \"kubernetes.io/projected/cb67b5ad-353e-4d96-8d94-fc69e4801f64-kube-api-access-z27mt\") pod \"swift-operator-controller-manager-68f46476f-2mxvz\" (UID: \"cb67b5ad-353e-4d96-8d94-fc69e4801f64\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.264098 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rfzx\" (UniqueName: \"kubernetes.io/projected/a675eb01-18af-4776-94e6-64c0b392248b-kube-api-access-8rfzx\") pod 
\"telemetry-operator-controller-manager-7f45b4ff68-b89r4\" (UID: \"a675eb01-18af-4776-94e6-64c0b392248b\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.264120 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjfq9\" (UniqueName: \"kubernetes.io/projected/8b48098d-ef4c-4cde-beef-a7c34573699b-kube-api-access-kjfq9\") pod \"octavia-operator-controller-manager-69f8888797-7jlc2\" (UID: \"8b48098d-ef4c-4cde-beef-a7c34573699b\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.332861 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s25hv\" (UniqueName: \"kubernetes.io/projected/21c4fbfe-19b7-4303-b4bf-72dbc90044dd-kube-api-access-s25hv\") pod \"neutron-operator-controller-manager-64ddbf8bb-ct4nk\" (UID: \"21c4fbfe-19b7-4303-b4bf-72dbc90044dd\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.335587 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptkvg\" (UniqueName: \"kubernetes.io/projected/cae8d6b1-4649-40bb-b710-1197ac78db1b-kube-api-access-ptkvg\") pod \"nova-operator-controller-manager-567668f5cf-4mtzl\" (UID: \"cae8d6b1-4649-40bb-b710-1197ac78db1b\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.336184 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjfq9\" (UniqueName: \"kubernetes.io/projected/8b48098d-ef4c-4cde-beef-a7c34573699b-kube-api-access-kjfq9\") pod \"octavia-operator-controller-manager-69f8888797-7jlc2\" (UID: \"8b48098d-ef4c-4cde-beef-a7c34573699b\") " 
pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.362419 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-wjwzl"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.366094 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8\" (UID: \"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.366138 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z27mt\" (UniqueName: \"kubernetes.io/projected/cb67b5ad-353e-4d96-8d94-fc69e4801f64-kube-api-access-z27mt\") pod \"swift-operator-controller-manager-68f46476f-2mxvz\" (UID: \"cb67b5ad-353e-4d96-8d94-fc69e4801f64\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.366162 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rfzx\" (UniqueName: \"kubernetes.io/projected/a675eb01-18af-4776-94e6-64c0b392248b-kube-api-access-8rfzx\") pod \"telemetry-operator-controller-manager-7f45b4ff68-b89r4\" (UID: \"a675eb01-18af-4776-94e6-64c0b392248b\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.366229 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6tzf\" (UniqueName: \"kubernetes.io/projected/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-kube-api-access-f6tzf\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8\" (UID: 
\"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.366256 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lhmm\" (UniqueName: \"kubernetes.io/projected/6af9a0a9-0546-4a59-bdca-1a0609421010-kube-api-access-6lhmm\") pod \"test-operator-controller-manager-7866795846-wjwzl\" (UID: \"6af9a0a9-0546-4a59-bdca-1a0609421010\") " pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.366303 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn9wp\" (UniqueName: \"kubernetes.io/projected/8c8d2c10-e4b6-4d37-977f-7ad685981d2f-kube-api-access-zn9wp\") pod \"placement-operator-controller-manager-8497b45c89-l5qll\" (UID: \"8c8d2c10-e4b6-4d37-977f-7ad685981d2f\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll" Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.366321 4697 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.366405 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert podName:3f3b7ed7-e806-4fa9-ac88-381c0b4bd237 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:17.866384292 +0000 UTC m=+1005.646429810 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" (UID: "3f3b7ed7-e806-4fa9-ac88-381c0b4bd237") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.366328 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jcch\" (UniqueName: \"kubernetes.io/projected/55a77be1-9486-4b8a-acc6-a4d8532016d3-kube-api-access-2jcch\") pod \"ovn-operator-controller-manager-d44cf6b75-lsgnq\" (UID: \"55a77be1-9486-4b8a-acc6-a4d8532016d3\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.384345 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.402995 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z27mt\" (UniqueName: \"kubernetes.io/projected/cb67b5ad-353e-4d96-8d94-fc69e4801f64-kube-api-access-z27mt\") pod \"swift-operator-controller-manager-68f46476f-2mxvz\" (UID: \"cb67b5ad-353e-4d96-8d94-fc69e4801f64\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.423641 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.424524 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.431932 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rfzx\" (UniqueName: \"kubernetes.io/projected/a675eb01-18af-4776-94e6-64c0b392248b-kube-api-access-8rfzx\") pod \"telemetry-operator-controller-manager-7f45b4ff68-b89r4\" (UID: \"a675eb01-18af-4776-94e6-64c0b392248b\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.432175 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-c5s7x" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.432263 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.433081 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jcch\" (UniqueName: \"kubernetes.io/projected/55a77be1-9486-4b8a-acc6-a4d8532016d3-kube-api-access-2jcch\") pod \"ovn-operator-controller-manager-d44cf6b75-lsgnq\" (UID: \"55a77be1-9486-4b8a-acc6-a4d8532016d3\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.434100 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6tzf\" (UniqueName: \"kubernetes.io/projected/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-kube-api-access-f6tzf\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8\" (UID: \"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.434520 4697 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zn9wp\" (UniqueName: \"kubernetes.io/projected/8c8d2c10-e4b6-4d37-977f-7ad685981d2f-kube-api-access-zn9wp\") pod \"placement-operator-controller-manager-8497b45c89-l5qll\" (UID: \"8c8d2c10-e4b6-4d37-977f-7ad685981d2f\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.446374 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.497673 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lhmm\" (UniqueName: \"kubernetes.io/projected/6af9a0a9-0546-4a59-bdca-1a0609421010-kube-api-access-6lhmm\") pod \"test-operator-controller-manager-7866795846-wjwzl\" (UID: \"6af9a0a9-0546-4a59-bdca-1a0609421010\") " pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.498010 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f8xp\" (UniqueName: \"kubernetes.io/projected/d8c72591-eb8d-4553-867d-60482d51c4db-kube-api-access-7f8xp\") pod \"watcher-operator-controller-manager-9d9d9f9cd-t7nvz\" (UID: \"d8c72591-eb8d-4553-867d-60482d51c4db\") " pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.499403 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.518627 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.533092 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.538389 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.557959 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lhmm\" (UniqueName: \"kubernetes.io/projected/6af9a0a9-0546-4a59-bdca-1a0609421010-kube-api-access-6lhmm\") pod \"test-operator-controller-manager-7866795846-wjwzl\" (UID: \"6af9a0a9-0546-4a59-bdca-1a0609421010\") " pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.563470 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.564352 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.569017 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.569198 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.569309 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tn462" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.577806 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.577673 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.598078 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.600675 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.600739 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert\") pod \"infra-operator-controller-manager-79d975b745-j9lmt\" (UID: \"d30d696e-1555-4fc2-9316-c795de608048\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.600759 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.600846 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nx2l\" (UniqueName: \"kubernetes.io/projected/33e3fc43-dbfe-4fff-bac3-6021dfa84982-kube-api-access-2nx2l\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.600886 
4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f8xp\" (UniqueName: \"kubernetes.io/projected/d8c72591-eb8d-4553-867d-60482d51c4db-kube-api-access-7f8xp\") pod \"watcher-operator-controller-manager-9d9d9f9cd-t7nvz\" (UID: \"d8c72591-eb8d-4553-867d-60482d51c4db\") " pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.601232 4697 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.601268 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert podName:d30d696e-1555-4fc2-9316-c795de608048 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:18.601256889 +0000 UTC m=+1006.381302297 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert") pod "infra-operator-controller-manager-79d975b745-j9lmt" (UID: "d30d696e-1555-4fc2-9316-c795de608048") : secret "infra-operator-webhook-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.622487 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.623954 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f8xp\" (UniqueName: \"kubernetes.io/projected/d8c72591-eb8d-4553-867d-60482d51c4db-kube-api-access-7f8xp\") pod \"watcher-operator-controller-manager-9d9d9f9cd-t7nvz\" (UID: \"d8c72591-eb8d-4553-867d-60482d51c4db\") " pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.634730 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.635687 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.642427 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6pttt" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.649591 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms"] Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.704169 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6ffv\" (UniqueName: \"kubernetes.io/projected/8546e4ea-d7f0-4244-8496-e962809c4203-kube-api-access-x6ffv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xnpms\" (UID: \"8546e4ea-d7f0-4244-8496-e962809c4203\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.704230 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.704288 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nx2l\" (UniqueName: \"kubernetes.io/projected/33e3fc43-dbfe-4fff-bac3-6021dfa84982-kube-api-access-2nx2l\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.704329 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.704475 4697 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.704522 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs podName:33e3fc43-dbfe-4fff-bac3-6021dfa84982 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:18.204507111 +0000 UTC m=+1005.984552519 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs") pod "openstack-operator-controller-manager-b45cc898b-82j7k" (UID: "33e3fc43-dbfe-4fff-bac3-6021dfa84982") : secret "metrics-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.704735 4697 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.704757 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs podName:33e3fc43-dbfe-4fff-bac3-6021dfa84982 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:18.204750227 +0000 UTC m=+1005.984795635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs") pod "openstack-operator-controller-manager-b45cc898b-82j7k" (UID: "33e3fc43-dbfe-4fff-bac3-6021dfa84982") : secret "webhook-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.738787 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nx2l\" (UniqueName: \"kubernetes.io/projected/33e3fc43-dbfe-4fff-bac3-6021dfa84982-kube-api-access-2nx2l\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.761360 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.807046 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6ffv\" (UniqueName: \"kubernetes.io/projected/8546e4ea-d7f0-4244-8496-e962809c4203-kube-api-access-x6ffv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xnpms\" (UID: \"8546e4ea-d7f0-4244-8496-e962809c4203\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.828397 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6ffv\" (UniqueName: \"kubernetes.io/projected/8546e4ea-d7f0-4244-8496-e962809c4203-kube-api-access-x6ffv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xnpms\" (UID: \"8546e4ea-d7f0-4244-8496-e962809c4203\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms" Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.908177 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8\" (UID: \"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.909063 4697 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: E0220 16:48:17.909155 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert podName:3f3b7ed7-e806-4fa9-ac88-381c0b4bd237 nodeName:}" failed. 
No retries permitted until 2026-02-20 16:48:18.90913061 +0000 UTC m=+1006.689176098 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" (UID: "3f3b7ed7-e806-4fa9-ac88-381c0b4bd237") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 16:48:17 crc kubenswrapper[4697]: I0220 16:48:17.965118 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms" Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.213939 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.214037 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.214162 4697 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.214210 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs podName:33e3fc43-dbfe-4fff-bac3-6021dfa84982 nodeName:}" failed. 
No retries permitted until 2026-02-20 16:48:19.214194141 +0000 UTC m=+1006.994239539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs") pod "openstack-operator-controller-manager-b45cc898b-82j7k" (UID: "33e3fc43-dbfe-4fff-bac3-6021dfa84982") : secret "metrics-server-cert" not found Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.214731 4697 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.214846 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs podName:33e3fc43-dbfe-4fff-bac3-6021dfa84982 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:19.214809237 +0000 UTC m=+1006.994854645 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs") pod "openstack-operator-controller-manager-b45cc898b-82j7k" (UID: "33e3fc43-dbfe-4fff-bac3-6021dfa84982") : secret "webhook-server-cert" not found Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.248870 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9"] Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.264369 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr"] Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.295340 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5"] Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.306869 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5"] Feb 20 16:48:18 crc kubenswrapper[4697]: W0220 16:48:18.316026 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71f3f3ad_1f6c_4d59_9fc8_b036014c1068.slice/crio-0d827b1fe965d407ae550f4d196a7ac12a21a754b71cccf9f0b970d32c4acc7d WatchSource:0}: Error finding container 0d827b1fe965d407ae550f4d196a7ac12a21a754b71cccf9f0b970d32c4acc7d: Status 404 returned error can't find the container with id 0d827b1fe965d407ae550f4d196a7ac12a21a754b71cccf9f0b970d32c4acc7d Feb 20 16:48:18 crc kubenswrapper[4697]: W0220 16:48:18.327511 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf56f8133_887c_456a_9cbf_6df7713789b3.slice/crio-c3c4572afb3317fa3051f9e6d1eba6739cb675589c4d4ef44814ab0e3c1d1e80 WatchSource:0}: Error finding container c3c4572afb3317fa3051f9e6d1eba6739cb675589c4d4ef44814ab0e3c1d1e80: Status 404 returned error can't find the container with id c3c4572afb3317fa3051f9e6d1eba6739cb675589c4d4ef44814ab0e3c1d1e80 Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.338876 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5"] Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.350999 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns"] Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.354489 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5" event={"ID":"f56f8133-887c-456a-9cbf-6df7713789b3","Type":"ContainerStarted","Data":"c3c4572afb3317fa3051f9e6d1eba6739cb675589c4d4ef44814ab0e3c1d1e80"} Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.356450 4697 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq"] Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.356842 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9" event={"ID":"0b5f03ab-32bb-48ef-b7d7-1ede5fb51924","Type":"ContainerStarted","Data":"4a04b8fdf79d991ebc017e2711ce4952f28c1437a1933533a9d9ba304cea7333"} Feb 20 16:48:18 crc kubenswrapper[4697]: W0220 16:48:18.358306 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdb933f_aa86_4b88_9b08_4783ce0f6e0c.slice/crio-0c34f37b37a16ac9cc0705f11d061cbe06fc7d65b25495acea369d5e80ee2a5d WatchSource:0}: Error finding container 0c34f37b37a16ac9cc0705f11d061cbe06fc7d65b25495acea369d5e80ee2a5d: Status 404 returned error can't find the container with id 0c34f37b37a16ac9cc0705f11d061cbe06fc7d65b25495acea369d5e80ee2a5d Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.358544 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr" event={"ID":"fcf00eef-940f-4da3-8359-325f1abb0c6d","Type":"ContainerStarted","Data":"8af6327e6260d02f403269768d2e8286c13bb804b3f9ae3d748df402dd49b340"} Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.360207 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5" event={"ID":"71f3f3ad-1f6c-4d59-9fc8-b036014c1068","Type":"ContainerStarted","Data":"0d827b1fe965d407ae550f4d196a7ac12a21a754b71cccf9f0b970d32c4acc7d"} Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.362327 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8"] Feb 20 16:48:18 crc kubenswrapper[4697]: W0220 16:48:18.362756 4697 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd233b891_dac3_4565_971b_85141828260d.slice/crio-3e5593b8378c6412f992d02e7bda5cb664e4a243643599f270876cea729ad419 WatchSource:0}: Error finding container 3e5593b8378c6412f992d02e7bda5cb664e4a243643599f270876cea729ad419: Status 404 returned error can't find the container with id 3e5593b8378c6412f992d02e7bda5cb664e4a243643599f270876cea729ad419 Feb 20 16:48:18 crc kubenswrapper[4697]: W0220 16:48:18.367907 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod366fd13a_060b_4572_9541_dbf88a507588.slice/crio-8f5ef9beb21f5dc20574f7d72b6708b92174a8044d353c24f48be1d4fe6c5f0e WatchSource:0}: Error finding container 8f5ef9beb21f5dc20574f7d72b6708b92174a8044d353c24f48be1d4fe6c5f0e: Status 404 returned error can't find the container with id 8f5ef9beb21f5dc20574f7d72b6708b92174a8044d353c24f48be1d4fe6c5f0e Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.376402 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm"] Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.516532 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk"] Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.525554 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq"] Feb 20 16:48:18 crc kubenswrapper[4697]: W0220 16:48:18.528475 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c8d2c10_e4b6_4d37_977f_7ad685981d2f.slice/crio-3930c33ce80eee6eeb8f5abc2007cea9186d01b544470b5f0e0299e26042af99 WatchSource:0}: Error finding container 3930c33ce80eee6eeb8f5abc2007cea9186d01b544470b5f0e0299e26042af99: Status 404 returned 
error can't find the container with id 3930c33ce80eee6eeb8f5abc2007cea9186d01b544470b5f0e0299e26042af99 Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.544935 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll"] Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.551750 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2jcch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-lsgnq_openstack-operators(55a77be1-9486-4b8a-acc6-a4d8532016d3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.553222 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl"] Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.553274 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" podUID="55a77be1-9486-4b8a-acc6-a4d8532016d3" Feb 20 16:48:18 crc kubenswrapper[4697]: W0220 16:48:18.556132 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcae8d6b1_4649_40bb_b710_1197ac78db1b.slice/crio-981e4250cd91cdce3b20d70b06a99b0cc2fbd1b4b9750aa73ef0e52b41eb6aac WatchSource:0}: Error finding container 981e4250cd91cdce3b20d70b06a99b0cc2fbd1b4b9750aa73ef0e52b41eb6aac: Status 404 returned error can't find the container with id 
981e4250cd91cdce3b20d70b06a99b0cc2fbd1b4b9750aa73ef0e52b41eb6aac Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.559391 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ptkvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-4mtzl_openstack-operators(cae8d6b1-4649-40bb-b710-1197ac78db1b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.561012 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" podUID="cae8d6b1-4649-40bb-b710-1197ac78db1b" Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.626163 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert\") pod \"infra-operator-controller-manager-79d975b745-j9lmt\" (UID: \"d30d696e-1555-4fc2-9316-c795de608048\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.626342 4697 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 
16:48:18.626417 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert podName:d30d696e-1555-4fc2-9316-c795de608048 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:20.626396584 +0000 UTC m=+1008.406441992 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert") pod "infra-operator-controller-manager-79d975b745-j9lmt" (UID: "d30d696e-1555-4fc2-9316-c795de608048") : secret "infra-operator-webhook-server-cert" not found Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.672083 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2"] Feb 20 16:48:18 crc kubenswrapper[4697]: W0220 16:48:18.678935 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b48098d_ef4c_4cde_beef_a7c34573699b.slice/crio-57dd40cd63ef762926e26e0b327be4d91079fcc9eb0f1406523966826e907242 WatchSource:0}: Error finding container 57dd40cd63ef762926e26e0b327be4d91079fcc9eb0f1406523966826e907242: Status 404 returned error can't find the container with id 57dd40cd63ef762926e26e0b327be4d91079fcc9eb0f1406523966826e907242 Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.680545 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4"] Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.680977 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kjfq9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-7jlc2_openstack-operators(8b48098d-ef4c-4cde-beef-a7c34573699b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.682301 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" podUID="8b48098d-ef4c-4cde-beef-a7c34573699b" Feb 20 16:48:18 crc kubenswrapper[4697]: W0220 16:48:18.683338 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda675eb01_18af_4776_94e6_64c0b392248b.slice/crio-208ed62972a3769d04e99161b43719cf0ace1b6f1872b856abd7b035eedf6ca5 WatchSource:0}: Error finding container 208ed62972a3769d04e99161b43719cf0ace1b6f1872b856abd7b035eedf6ca5: Status 404 returned error can't find the container with id 208ed62972a3769d04e99161b43719cf0ace1b6f1872b856abd7b035eedf6ca5 Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.686971 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8rfzx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-b89r4_openstack-operators(a675eb01-18af-4776-94e6-64c0b392248b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.688516 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j"] Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.688574 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" podUID="a675eb01-18af-4776-94e6-64c0b392248b" Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.693802 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms"] Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.699112 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz"] Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.704007 4697 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz"] Feb 20 16:48:18 crc kubenswrapper[4697]: W0220 16:48:18.706147 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb67b5ad_353e_4d96_8d94_fc69e4801f64.slice/crio-eab3a1988dc9a55ca656c15313becacebcedc50c56d12602cd318a9af2c99193 WatchSource:0}: Error finding container eab3a1988dc9a55ca656c15313becacebcedc50c56d12602cd318a9af2c99193: Status 404 returned error can't find the container with id eab3a1988dc9a55ca656c15313becacebcedc50c56d12602cd318a9af2c99193 Feb 20 16:48:18 crc kubenswrapper[4697]: I0220 16:48:18.707231 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-wjwzl"] Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.708646 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z27mt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-2mxvz_openstack-operators(cb67b5ad-353e-4d96-8d94-fc69e4801f64): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.709982 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" podUID="cb67b5ad-353e-4d96-8d94-fc69e4801f64" Feb 20 16:48:18 crc 
kubenswrapper[4697]: W0220 16:48:18.710840 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8546e4ea_d7f0_4244_8496_e962809c4203.slice/crio-0bf2217446441977fc915b3a4b3c94c3dbe990db834de9f055e01147a9b116c1 WatchSource:0}: Error finding container 0bf2217446441977fc915b3a4b3c94c3dbe990db834de9f055e01147a9b116c1: Status 404 returned error can't find the container with id 0bf2217446441977fc915b3a4b3c94c3dbe990db834de9f055e01147a9b116c1 Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.711132 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.38:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7f8xp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-9d9d9f9cd-t7nvz_openstack-operators(d8c72591-eb8d-4553-867d-60482d51c4db): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.712402 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" podUID="d8c72591-eb8d-4553-867d-60482d51c4db" Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.713675 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x6ffv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xnpms_openstack-operators(8546e4ea-d7f0-4244-8496-e962809c4203): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.714814 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms" podUID="8546e4ea-d7f0-4244-8496-e962809c4203" Feb 20 16:48:18 crc kubenswrapper[4697]: W0220 16:48:18.715558 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6af9a0a9_0546_4a59_bdca_1a0609421010.slice/crio-6a692f492c43686d373bbd67d1d5667f569a2547f36930e8437796f8265df2ab WatchSource:0}: Error finding container 6a692f492c43686d373bbd67d1d5667f569a2547f36930e8437796f8265df2ab: Status 404 returned error can't find the container with id 6a692f492c43686d373bbd67d1d5667f569a2547f36930e8437796f8265df2ab Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.718782 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6lhmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-wjwzl_openstack-operators(6af9a0a9-0546-4a59-bdca-1a0609421010): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.720000 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" podUID="6af9a0a9-0546-4a59-bdca-1a0609421010" Feb 20 16:48:18 crc 
kubenswrapper[4697]: I0220 16:48:18.933291 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8\" (UID: \"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.934426 4697 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 16:48:18 crc kubenswrapper[4697]: E0220 16:48:18.934500 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert podName:3f3b7ed7-e806-4fa9-ac88-381c0b4bd237 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:20.934485101 +0000 UTC m=+1008.714530509 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" (UID: "3f3b7ed7-e806-4fa9-ac88-381c0b4bd237") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.237252 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.237340 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.237503 4697 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.237600 4697 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.237651 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs podName:33e3fc43-dbfe-4fff-bac3-6021dfa84982 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:21.237634993 +0000 UTC m=+1009.017680401 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs") pod "openstack-operator-controller-manager-b45cc898b-82j7k" (UID: "33e3fc43-dbfe-4fff-bac3-6021dfa84982") : secret "webhook-server-cert" not found Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.237670 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs podName:33e3fc43-dbfe-4fff-bac3-6021dfa84982 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:21.237664704 +0000 UTC m=+1009.017710112 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs") pod "openstack-operator-controller-manager-b45cc898b-82j7k" (UID: "33e3fc43-dbfe-4fff-bac3-6021dfa84982") : secret "metrics-server-cert" not found Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.372060 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" event={"ID":"cae8d6b1-4649-40bb-b710-1197ac78db1b","Type":"ContainerStarted","Data":"981e4250cd91cdce3b20d70b06a99b0cc2fbd1b4b9750aa73ef0e52b41eb6aac"} Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.374063 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" event={"ID":"d8c72591-eb8d-4553-867d-60482d51c4db","Type":"ContainerStarted","Data":"7ea1f79c3b2bae942aba3bce394a7b3d4e5689d8f05298259af3bbd1191357ad"} Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.375241 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" podUID="cae8d6b1-4649-40bb-b710-1197ac78db1b" Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.375552 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.38:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" podUID="d8c72591-eb8d-4553-867d-60482d51c4db" Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.376134 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" event={"ID":"55a77be1-9486-4b8a-acc6-a4d8532016d3","Type":"ContainerStarted","Data":"3408425cfabdc4a777f88f9f9b6e91ab0709249946516f17cd2367075ab9211e"} Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.377719 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" podUID="55a77be1-9486-4b8a-acc6-a4d8532016d3" Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.378807 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" event={"ID":"6af9a0a9-0546-4a59-bdca-1a0609421010","Type":"ContainerStarted","Data":"6a692f492c43686d373bbd67d1d5667f569a2547f36930e8437796f8265df2ab"} Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.380073 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" podUID="6af9a0a9-0546-4a59-bdca-1a0609421010" Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.383449 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms" event={"ID":"8546e4ea-d7f0-4244-8496-e962809c4203","Type":"ContainerStarted","Data":"0bf2217446441977fc915b3a4b3c94c3dbe990db834de9f055e01147a9b116c1"} Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.392211 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms" podUID="8546e4ea-d7f0-4244-8496-e962809c4203" Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.394098 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm" event={"ID":"b0605f71-51c5-49d9-8936-77affb7cf0bf","Type":"ContainerStarted","Data":"45ffdaaee4ce59d74f7e9345eeaa4a088528d94ef5c28ca2eb60ecc81534dc1c"} Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.396341 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" event={"ID":"cb67b5ad-353e-4d96-8d94-fc69e4801f64","Type":"ContainerStarted","Data":"eab3a1988dc9a55ca656c15313becacebcedc50c56d12602cd318a9af2c99193"} Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.398921 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" podUID="cb67b5ad-353e-4d96-8d94-fc69e4801f64" Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.399759 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" event={"ID":"8b48098d-ef4c-4cde-beef-a7c34573699b","Type":"ContainerStarted","Data":"57dd40cd63ef762926e26e0b327be4d91079fcc9eb0f1406523966826e907242"} Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.402974 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" podUID="8b48098d-ef4c-4cde-beef-a7c34573699b" Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.403688 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j" event={"ID":"d9926f6e-afca-48ad-8a52-fb7f53ba3dec","Type":"ContainerStarted","Data":"306289081b1c8ee8001b5f9e6ebd7e15cbcb403d1bb938ce9aa5f481dbd54915"} Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.409394 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8" event={"ID":"560f1df6-c03f-42ad-8175-5508f56e1ecc","Type":"ContainerStarted","Data":"c4fe087dae10c202a365c5f70a2ee8492713162317257ebbc98d7a4f10dc5090"} Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.411849 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5" event={"ID":"4fdb933f-aa86-4b88-9b08-4783ce0f6e0c","Type":"ContainerStarted","Data":"0c34f37b37a16ac9cc0705f11d061cbe06fc7d65b25495acea369d5e80ee2a5d"} Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.412896 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq" event={"ID":"d233b891-dac3-4565-971b-85141828260d","Type":"ContainerStarted","Data":"3e5593b8378c6412f992d02e7bda5cb664e4a243643599f270876cea729ad419"} Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.427110 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" event={"ID":"a675eb01-18af-4776-94e6-64c0b392248b","Type":"ContainerStarted","Data":"208ed62972a3769d04e99161b43719cf0ace1b6f1872b856abd7b035eedf6ca5"} Feb 20 16:48:19 crc kubenswrapper[4697]: E0220 16:48:19.428486 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" podUID="a675eb01-18af-4776-94e6-64c0b392248b" Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.429246 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll" event={"ID":"8c8d2c10-e4b6-4d37-977f-7ad685981d2f","Type":"ContainerStarted","Data":"3930c33ce80eee6eeb8f5abc2007cea9186d01b544470b5f0e0299e26042af99"} Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.434548 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk" 
event={"ID":"21c4fbfe-19b7-4303-b4bf-72dbc90044dd","Type":"ContainerStarted","Data":"e397a78dd79e4f38f0eabfc1a5f305b882d8f650b21da45bfd744b02401c2586"} Feb 20 16:48:19 crc kubenswrapper[4697]: I0220 16:48:19.435775 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns" event={"ID":"366fd13a-060b-4572-9541-dbf88a507588","Type":"ContainerStarted","Data":"8f5ef9beb21f5dc20574f7d72b6708b92174a8044d353c24f48be1d4fe6c5f0e"} Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.471960 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.38:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" podUID="d8c72591-eb8d-4553-867d-60482d51c4db" Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.472516 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" podUID="a675eb01-18af-4776-94e6-64c0b392248b" Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.472554 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms" podUID="8546e4ea-d7f0-4244-8496-e962809c4203" Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.472588 4697 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" podUID="8b48098d-ef4c-4cde-beef-a7c34573699b" Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.472630 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" podUID="55a77be1-9486-4b8a-acc6-a4d8532016d3" Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.472670 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" podUID="cb67b5ad-353e-4d96-8d94-fc69e4801f64" Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.472704 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" podUID="6af9a0a9-0546-4a59-bdca-1a0609421010" Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.472735 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" podUID="cae8d6b1-4649-40bb-b710-1197ac78db1b" Feb 20 16:48:20 crc kubenswrapper[4697]: I0220 16:48:20.666190 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert\") pod \"infra-operator-controller-manager-79d975b745-j9lmt\" (UID: \"d30d696e-1555-4fc2-9316-c795de608048\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.666361 4697 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.666408 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert podName:d30d696e-1555-4fc2-9316-c795de608048 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:24.666394054 +0000 UTC m=+1012.446439462 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert") pod "infra-operator-controller-manager-79d975b745-j9lmt" (UID: "d30d696e-1555-4fc2-9316-c795de608048") : secret "infra-operator-webhook-server-cert" not found Feb 20 16:48:20 crc kubenswrapper[4697]: I0220 16:48:20.969752 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8\" (UID: \"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.969981 4697 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 16:48:20 crc kubenswrapper[4697]: E0220 16:48:20.970111 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert podName:3f3b7ed7-e806-4fa9-ac88-381c0b4bd237 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:24.970064369 +0000 UTC m=+1012.750109777 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" (UID: "3f3b7ed7-e806-4fa9-ac88-381c0b4bd237") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 16:48:21 crc kubenswrapper[4697]: I0220 16:48:21.274023 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:21 crc kubenswrapper[4697]: I0220 16:48:21.274111 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:21 crc kubenswrapper[4697]: E0220 16:48:21.274179 4697 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 16:48:21 crc kubenswrapper[4697]: E0220 16:48:21.274263 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs podName:33e3fc43-dbfe-4fff-bac3-6021dfa84982 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:25.274241497 +0000 UTC m=+1013.054286905 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs") pod "openstack-operator-controller-manager-b45cc898b-82j7k" (UID: "33e3fc43-dbfe-4fff-bac3-6021dfa84982") : secret "metrics-server-cert" not found Feb 20 16:48:21 crc kubenswrapper[4697]: E0220 16:48:21.274280 4697 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 16:48:21 crc kubenswrapper[4697]: E0220 16:48:21.274354 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs podName:33e3fc43-dbfe-4fff-bac3-6021dfa84982 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:25.274334729 +0000 UTC m=+1013.054380137 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs") pod "openstack-operator-controller-manager-b45cc898b-82j7k" (UID: "33e3fc43-dbfe-4fff-bac3-6021dfa84982") : secret "webhook-server-cert" not found Feb 20 16:48:24 crc kubenswrapper[4697]: I0220 16:48:24.725651 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert\") pod \"infra-operator-controller-manager-79d975b745-j9lmt\" (UID: \"d30d696e-1555-4fc2-9316-c795de608048\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:24 crc kubenswrapper[4697]: E0220 16:48:24.725794 4697 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 16:48:24 crc kubenswrapper[4697]: E0220 16:48:24.726009 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert 
podName:d30d696e-1555-4fc2-9316-c795de608048 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:32.725991745 +0000 UTC m=+1020.506037153 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert") pod "infra-operator-controller-manager-79d975b745-j9lmt" (UID: "d30d696e-1555-4fc2-9316-c795de608048") : secret "infra-operator-webhook-server-cert" not found Feb 20 16:48:25 crc kubenswrapper[4697]: I0220 16:48:25.029964 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8\" (UID: \"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:25 crc kubenswrapper[4697]: E0220 16:48:25.030139 4697 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 16:48:25 crc kubenswrapper[4697]: E0220 16:48:25.030211 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert podName:3f3b7ed7-e806-4fa9-ac88-381c0b4bd237 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:33.030189943 +0000 UTC m=+1020.810235361 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" (UID: "3f3b7ed7-e806-4fa9-ac88-381c0b4bd237") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 16:48:25 crc kubenswrapper[4697]: I0220 16:48:25.339694 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:25 crc kubenswrapper[4697]: I0220 16:48:25.339782 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:25 crc kubenswrapper[4697]: E0220 16:48:25.339950 4697 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 16:48:25 crc kubenswrapper[4697]: E0220 16:48:25.339995 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs podName:33e3fc43-dbfe-4fff-bac3-6021dfa84982 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:33.339981463 +0000 UTC m=+1021.120026871 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs") pod "openstack-operator-controller-manager-b45cc898b-82j7k" (UID: "33e3fc43-dbfe-4fff-bac3-6021dfa84982") : secret "webhook-server-cert" not found Feb 20 16:48:25 crc kubenswrapper[4697]: E0220 16:48:25.340290 4697 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 16:48:25 crc kubenswrapper[4697]: E0220 16:48:25.340359 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs podName:33e3fc43-dbfe-4fff-bac3-6021dfa84982 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:33.340346033 +0000 UTC m=+1021.120391441 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs") pod "openstack-operator-controller-manager-b45cc898b-82j7k" (UID: "33e3fc43-dbfe-4fff-bac3-6021dfa84982") : secret "metrics-server-cert" not found Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.511446 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm" event={"ID":"b0605f71-51c5-49d9-8936-77affb7cf0bf","Type":"ContainerStarted","Data":"6f77791956f1052427dc7946059ac56eb2f56a5cbc6cfdf02321c8200abcdca0"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.512194 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.513092 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll" 
event={"ID":"8c8d2c10-e4b6-4d37-977f-7ad685981d2f","Type":"ContainerStarted","Data":"8096fabfdc5d240bc5542463a119cb7fb484d6399c5c2bb916dbbdca583d2b6f"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.513182 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.514456 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk" event={"ID":"21c4fbfe-19b7-4303-b4bf-72dbc90044dd","Type":"ContainerStarted","Data":"673c4022f2ad0f1cace2370f1de035541f53f314e3b243312e301e9d9365b019"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.514560 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.515549 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5" event={"ID":"f56f8133-887c-456a-9cbf-6df7713789b3","Type":"ContainerStarted","Data":"7e50dfde27283c2cb566362bc021f9ef8a5ce489b52507c2ce6056e7d6d1b7d9"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.515602 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.516902 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9" event={"ID":"0b5f03ab-32bb-48ef-b7d7-1ede5fb51924","Type":"ContainerStarted","Data":"aa66f42380c39800476cf65f3d24cd52ea7fabc3e9cf2b1f98cd37ace374a204"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.517047 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.518332 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns" event={"ID":"366fd13a-060b-4572-9541-dbf88a507588","Type":"ContainerStarted","Data":"92e7623e23ebf9ef638284b4a282fce023f331689bf3b2c5c570cc5fff6242b6"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.518467 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.519899 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5" event={"ID":"71f3f3ad-1f6c-4d59-9fc8-b036014c1068","Type":"ContainerStarted","Data":"872b3be854bf19fbd1ae2858194d368529989980173a5e87b89a6e78941f6180"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.519973 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.521099 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq" event={"ID":"d233b891-dac3-4565-971b-85141828260d","Type":"ContainerStarted","Data":"c3998c200c9e5806aa8b57a5b2919d12b5e510b60b213f9d09a7118d82dbaaf5"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.521143 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.522465 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5" 
event={"ID":"4fdb933f-aa86-4b88-9b08-4783ce0f6e0c","Type":"ContainerStarted","Data":"a5ce747878d6c0eb3a07d3de68d9149cb760477cdbaf80a1564fd725f729564a"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.522598 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.523658 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j" event={"ID":"d9926f6e-afca-48ad-8a52-fb7f53ba3dec","Type":"ContainerStarted","Data":"d2bae0d6631a9eb51c74ff8117adeb7ff68367e4db08a429498209441543c986"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.523710 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.525269 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr" event={"ID":"fcf00eef-940f-4da3-8359-325f1abb0c6d","Type":"ContainerStarted","Data":"b218ba552b81540a5f8a1a984bcb7a7799a2c3b7ea7b8c4164be614b5d33e559"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.525378 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.526186 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8" event={"ID":"560f1df6-c03f-42ad-8175-5508f56e1ecc","Type":"ContainerStarted","Data":"786148170bb4879cd061cde0d767a332a1b27c043c4a0ea81c350ee0fe479d56"} Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.526378 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.550982 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm" podStartSLOduration=3.215230575 podStartE2EDuration="13.550965981s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.375050528 +0000 UTC m=+1006.155095936" lastFinishedPulling="2026-02-20 16:48:28.710785934 +0000 UTC m=+1016.490831342" observedRunningTime="2026-02-20 16:48:29.547144924 +0000 UTC m=+1017.327190332" watchObservedRunningTime="2026-02-20 16:48:29.550965981 +0000 UTC m=+1017.331011389" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.577602 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5" podStartSLOduration=3.202620713 podStartE2EDuration="13.577587177s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.336461167 +0000 UTC m=+1006.116506575" lastFinishedPulling="2026-02-20 16:48:28.711427631 +0000 UTC m=+1016.491473039" observedRunningTime="2026-02-20 16:48:29.574747245 +0000 UTC m=+1017.354792653" watchObservedRunningTime="2026-02-20 16:48:29.577587177 +0000 UTC m=+1017.357632585" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.607588 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j" podStartSLOduration=3.5932923089999997 podStartE2EDuration="13.607572349s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.697146551 +0000 UTC m=+1006.477191959" lastFinishedPulling="2026-02-20 16:48:28.711426591 +0000 UTC m=+1016.491471999" observedRunningTime="2026-02-20 16:48:29.607252401 +0000 UTC m=+1017.387297809" 
watchObservedRunningTime="2026-02-20 16:48:29.607572349 +0000 UTC m=+1017.387617757" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.634580 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr" podStartSLOduration=3.212073505 podStartE2EDuration="13.634564685s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.288378696 +0000 UTC m=+1006.068424104" lastFinishedPulling="2026-02-20 16:48:28.710869836 +0000 UTC m=+1016.490915284" observedRunningTime="2026-02-20 16:48:29.627995868 +0000 UTC m=+1017.408041276" watchObservedRunningTime="2026-02-20 16:48:29.634564685 +0000 UTC m=+1017.414610093" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.648210 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll" podStartSLOduration=3.478290038 podStartE2EDuration="13.648192781s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.540755028 +0000 UTC m=+1006.320800446" lastFinishedPulling="2026-02-20 16:48:28.710657781 +0000 UTC m=+1016.490703189" observedRunningTime="2026-02-20 16:48:29.64383351 +0000 UTC m=+1017.423878928" watchObservedRunningTime="2026-02-20 16:48:29.648192781 +0000 UTC m=+1017.428238189" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.663874 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8" podStartSLOduration=3.318591141 podStartE2EDuration="13.663854379s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.374620557 +0000 UTC m=+1006.154665965" lastFinishedPulling="2026-02-20 16:48:28.719883785 +0000 UTC m=+1016.499929203" observedRunningTime="2026-02-20 16:48:29.65995708 +0000 UTC m=+1017.440002488" 
watchObservedRunningTime="2026-02-20 16:48:29.663854379 +0000 UTC m=+1017.443899787" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.682747 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk" podStartSLOduration=3.48825113 podStartE2EDuration="13.682732698s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.516123112 +0000 UTC m=+1006.296168530" lastFinishedPulling="2026-02-20 16:48:28.71060467 +0000 UTC m=+1016.490650098" observedRunningTime="2026-02-20 16:48:29.679945558 +0000 UTC m=+1017.459990966" watchObservedRunningTime="2026-02-20 16:48:29.682732698 +0000 UTC m=+1017.462778106" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.755176 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9" podStartSLOduration=3.335044618 podStartE2EDuration="13.755161759s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.290491909 +0000 UTC m=+1006.070537317" lastFinishedPulling="2026-02-20 16:48:28.71060904 +0000 UTC m=+1016.490654458" observedRunningTime="2026-02-20 16:48:29.735476488 +0000 UTC m=+1017.515521906" watchObservedRunningTime="2026-02-20 16:48:29.755161759 +0000 UTC m=+1017.535207167" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.755840 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5" podStartSLOduration=3.365931974 podStartE2EDuration="13.755834246s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.323367685 +0000 UTC m=+1006.103413093" lastFinishedPulling="2026-02-20 16:48:28.713269947 +0000 UTC m=+1016.493315365" observedRunningTime="2026-02-20 16:48:29.754600374 +0000 UTC m=+1017.534645772" 
watchObservedRunningTime="2026-02-20 16:48:29.755834246 +0000 UTC m=+1017.535879654" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.781193 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq" podStartSLOduration=3.445176237 podStartE2EDuration="13.7811749s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.375367496 +0000 UTC m=+1006.155412904" lastFinishedPulling="2026-02-20 16:48:28.711366159 +0000 UTC m=+1016.491411567" observedRunningTime="2026-02-20 16:48:29.776336477 +0000 UTC m=+1017.556381885" watchObservedRunningTime="2026-02-20 16:48:29.7811749 +0000 UTC m=+1017.561220308" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.829100 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5" podStartSLOduration=3.376288126 podStartE2EDuration="13.829085457s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.368011869 +0000 UTC m=+1006.148057297" lastFinishedPulling="2026-02-20 16:48:28.82080921 +0000 UTC m=+1016.600854628" observedRunningTime="2026-02-20 16:48:29.821670118 +0000 UTC m=+1017.601715526" watchObservedRunningTime="2026-02-20 16:48:29.829085457 +0000 UTC m=+1017.609130865" Feb 20 16:48:29 crc kubenswrapper[4697]: I0220 16:48:29.867296 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns" podStartSLOduration=3.5300968839999998 podStartE2EDuration="13.867275647s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.37513392 +0000 UTC m=+1006.155179328" lastFinishedPulling="2026-02-20 16:48:28.712312673 +0000 UTC m=+1016.492358091" observedRunningTime="2026-02-20 16:48:29.85398956 +0000 UTC m=+1017.634034968" 
watchObservedRunningTime="2026-02-20 16:48:29.867275647 +0000 UTC m=+1017.647321055" Feb 20 16:48:32 crc kubenswrapper[4697]: I0220 16:48:32.748514 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert\") pod \"infra-operator-controller-manager-79d975b745-j9lmt\" (UID: \"d30d696e-1555-4fc2-9316-c795de608048\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:32 crc kubenswrapper[4697]: I0220 16:48:32.767256 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30d696e-1555-4fc2-9316-c795de608048-cert\") pod \"infra-operator-controller-manager-79d975b745-j9lmt\" (UID: \"d30d696e-1555-4fc2-9316-c795de608048\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:33 crc kubenswrapper[4697]: I0220 16:48:33.020278 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-h8cgm" Feb 20 16:48:33 crc kubenswrapper[4697]: I0220 16:48:33.029034 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:33 crc kubenswrapper[4697]: I0220 16:48:33.059708 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8\" (UID: \"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:33 crc kubenswrapper[4697]: I0220 16:48:33.065688 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f3b7ed7-e806-4fa9-ac88-381c0b4bd237-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8\" (UID: \"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:33 crc kubenswrapper[4697]: I0220 16:48:33.099918 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jh8v8" Feb 20 16:48:33 crc kubenswrapper[4697]: I0220 16:48:33.108784 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:33 crc kubenswrapper[4697]: I0220 16:48:33.371663 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:33 crc kubenswrapper[4697]: I0220 16:48:33.372051 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:33 crc kubenswrapper[4697]: E0220 16:48:33.372203 4697 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 16:48:33 crc kubenswrapper[4697]: E0220 16:48:33.372254 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs podName:33e3fc43-dbfe-4fff-bac3-6021dfa84982 nodeName:}" failed. No retries permitted until 2026-02-20 16:48:49.372241737 +0000 UTC m=+1037.152287145 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs") pod "openstack-operator-controller-manager-b45cc898b-82j7k" (UID: "33e3fc43-dbfe-4fff-bac3-6021dfa84982") : secret "webhook-server-cert" not found Feb 20 16:48:33 crc kubenswrapper[4697]: I0220 16:48:33.376041 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-metrics-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:33 crc kubenswrapper[4697]: I0220 16:48:33.513996 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt"] Feb 20 16:48:33 crc kubenswrapper[4697]: W0220 16:48:33.714310 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd30d696e_1555_4fc2_9316_c795de608048.slice/crio-8bfcd3a1c533dddc337cda6eb24eb897b1d50a260204de2740276aa8aba214a1 WatchSource:0}: Error finding container 8bfcd3a1c533dddc337cda6eb24eb897b1d50a260204de2740276aa8aba214a1: Status 404 returned error can't find the container with id 8bfcd3a1c533dddc337cda6eb24eb897b1d50a260204de2740276aa8aba214a1 Feb 20 16:48:34 crc kubenswrapper[4697]: I0220 16:48:34.561102 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" event={"ID":"d30d696e-1555-4fc2-9316-c795de608048","Type":"ContainerStarted","Data":"8bfcd3a1c533dddc337cda6eb24eb897b1d50a260204de2740276aa8aba214a1"} Feb 20 16:48:36 crc kubenswrapper[4697]: I0220 16:48:36.976262 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-kqjzr" Feb 20 16:48:37 crc kubenswrapper[4697]: I0220 16:48:37.008925 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-9zdp9" Feb 20 16:48:37 crc kubenswrapper[4697]: I0220 16:48:37.023828 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-jxdb5" Feb 20 16:48:37 crc kubenswrapper[4697]: I0220 16:48:37.061908 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5xkg5" Feb 20 16:48:37 crc kubenswrapper[4697]: I0220 16:48:37.086340 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qjfrm" Feb 20 16:48:37 crc kubenswrapper[4697]: I0220 16:48:37.091423 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4nqmq" Feb 20 16:48:37 crc kubenswrapper[4697]: I0220 16:48:37.152535 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-rw4q5" Feb 20 16:48:37 crc kubenswrapper[4697]: I0220 16:48:37.225096 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-dknf8" Feb 20 16:48:37 crc kubenswrapper[4697]: I0220 16:48:37.258660 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-gsmns" Feb 20 16:48:37 crc kubenswrapper[4697]: I0220 16:48:37.387951 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ct4nk" Feb 20 16:48:37 crc kubenswrapper[4697]: I0220 16:48:37.442120 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-82v6j" Feb 20 16:48:37 crc kubenswrapper[4697]: I0220 16:48:37.541598 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-l5qll" Feb 20 16:48:38 crc kubenswrapper[4697]: I0220 16:48:38.189333 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8"] Feb 20 16:48:39 crc kubenswrapper[4697]: I0220 16:48:39.598270 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" event={"ID":"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237","Type":"ContainerStarted","Data":"69e448ef1f966fdfd8d3c69ed89c4dd606df01ede93bc3a97c9e278d3a2d75d6"} Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.642331 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" event={"ID":"cae8d6b1-4649-40bb-b710-1197ac78db1b","Type":"ContainerStarted","Data":"d16de71d774fa68a204cb56daf0f980bcdcb217034829f74ce96b60b37f7c623"} Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.643972 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.648901 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" event={"ID":"d30d696e-1555-4fc2-9316-c795de608048","Type":"ContainerStarted","Data":"9cfe563cc268dc149650d0be91de333eeb5379c269bf05a4bb01e603eed1c9fb"} Feb 20 16:48:40 
crc kubenswrapper[4697]: I0220 16:48:40.649275 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.661093 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" event={"ID":"d8c72591-eb8d-4553-867d-60482d51c4db","Type":"ContainerStarted","Data":"12208999aab3128d527031820b30611d92548d3084032fcdcbef939fd1406637"} Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.661556 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.669984 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" event={"ID":"8b48098d-ef4c-4cde-beef-a7c34573699b","Type":"ContainerStarted","Data":"94ff0d80af5d946d6ef01bf7a72e1b9c743e8fbe5b1672d662874b5ac3b9f74a"} Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.670376 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.671865 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" event={"ID":"a675eb01-18af-4776-94e6-64c0b392248b","Type":"ContainerStarted","Data":"53938af907da0b7e1aa71205c47880ba67ae2473f54c7fbe92d9b9735d28a9dc"} Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.672032 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.674860 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" event={"ID":"55a77be1-9486-4b8a-acc6-a4d8532016d3","Type":"ContainerStarted","Data":"d6a3da605eafcdff28e6f910fb308cb39ff5115cd6c6d6fe1e8eb42a1b352a5d"} Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.675003 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.676179 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" podStartSLOduration=3.487644505 podStartE2EDuration="24.676171296s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.559274188 +0000 UTC m=+1006.339319596" lastFinishedPulling="2026-02-20 16:48:39.747800979 +0000 UTC m=+1027.527846387" observedRunningTime="2026-02-20 16:48:40.675000417 +0000 UTC m=+1028.455045825" watchObservedRunningTime="2026-02-20 16:48:40.676171296 +0000 UTC m=+1028.456216704" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.678508 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" event={"ID":"cb67b5ad-353e-4d96-8d94-fc69e4801f64","Type":"ContainerStarted","Data":"9a25b8dc994303f526efc6503110b34998f9d1999d2e9f377193dbf37ca4dfdc"} Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.678766 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.688648 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" event={"ID":"6af9a0a9-0546-4a59-bdca-1a0609421010","Type":"ContainerStarted","Data":"3c8db0f89881b51fb60f30dc5a6b828eddd0ad3bb7551467500ba018810a0bc0"} Feb 20 
16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.689320 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.714116 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" podStartSLOduration=18.652328591 podStartE2EDuration="24.71409971s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:33.716638398 +0000 UTC m=+1021.496683806" lastFinishedPulling="2026-02-20 16:48:39.778409517 +0000 UTC m=+1027.558454925" observedRunningTime="2026-02-20 16:48:40.712672374 +0000 UTC m=+1028.492717782" watchObservedRunningTime="2026-02-20 16:48:40.71409971 +0000 UTC m=+1028.494145118" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.739360 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" podStartSLOduration=4.753196518 podStartE2EDuration="24.739345231s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.680869188 +0000 UTC m=+1006.460914596" lastFinishedPulling="2026-02-20 16:48:38.667017901 +0000 UTC m=+1026.447063309" observedRunningTime="2026-02-20 16:48:40.737481684 +0000 UTC m=+1028.517527092" watchObservedRunningTime="2026-02-20 16:48:40.739345231 +0000 UTC m=+1028.519390639" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.780629 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" podStartSLOduration=3.306119551 podStartE2EDuration="23.78061188s" podCreationTimestamp="2026-02-20 16:48:17 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.711022674 +0000 UTC m=+1006.491068072" lastFinishedPulling="2026-02-20 16:48:39.185514953 +0000 UTC 
m=+1026.965560401" observedRunningTime="2026-02-20 16:48:40.779909482 +0000 UTC m=+1028.559954890" watchObservedRunningTime="2026-02-20 16:48:40.78061188 +0000 UTC m=+1028.560657288" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.830875 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" podStartSLOduration=3.59172697 podStartE2EDuration="24.830858707s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.551579043 +0000 UTC m=+1006.331624451" lastFinishedPulling="2026-02-20 16:48:39.79071078 +0000 UTC m=+1027.570756188" observedRunningTime="2026-02-20 16:48:40.808122599 +0000 UTC m=+1028.588168007" watchObservedRunningTime="2026-02-20 16:48:40.830858707 +0000 UTC m=+1028.610904115" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.858851 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" podStartSLOduration=2.767159017 podStartE2EDuration="23.858826997s" podCreationTimestamp="2026-02-20 16:48:17 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.686786188 +0000 UTC m=+1006.466831586" lastFinishedPulling="2026-02-20 16:48:39.778454158 +0000 UTC m=+1027.558499566" observedRunningTime="2026-02-20 16:48:40.83455326 +0000 UTC m=+1028.614598668" watchObservedRunningTime="2026-02-20 16:48:40.858826997 +0000 UTC m=+1028.638872405" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.882948 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" podStartSLOduration=2.84794581 podStartE2EDuration="23.88293048s" podCreationTimestamp="2026-02-20 16:48:17 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.718579966 +0000 UTC m=+1006.498625374" lastFinishedPulling="2026-02-20 16:48:39.753564636 +0000 UTC m=+1027.533610044" 
observedRunningTime="2026-02-20 16:48:40.881809771 +0000 UTC m=+1028.661855169" watchObservedRunningTime="2026-02-20 16:48:40.88293048 +0000 UTC m=+1028.662975888" Feb 20 16:48:40 crc kubenswrapper[4697]: I0220 16:48:40.883757 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" podStartSLOduration=3.838923861 podStartE2EDuration="24.883753551s" podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.708531161 +0000 UTC m=+1006.488576569" lastFinishedPulling="2026-02-20 16:48:39.753360851 +0000 UTC m=+1027.533406259" observedRunningTime="2026-02-20 16:48:40.860751066 +0000 UTC m=+1028.640796474" watchObservedRunningTime="2026-02-20 16:48:40.883753551 +0000 UTC m=+1028.663798949" Feb 20 16:48:47 crc kubenswrapper[4697]: I0220 16:48:47.449548 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4mtzl" Feb 20 16:48:47 crc kubenswrapper[4697]: I0220 16:48:47.506092 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-7jlc2" Feb 20 16:48:47 crc kubenswrapper[4697]: I0220 16:48:47.537194 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-lsgnq" Feb 20 16:48:47 crc kubenswrapper[4697]: I0220 16:48:47.573190 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-2mxvz" Feb 20 16:48:47 crc kubenswrapper[4697]: I0220 16:48:47.605207 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-b89r4" Feb 20 16:48:47 crc kubenswrapper[4697]: I0220 16:48:47.630306 4697 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-wjwzl" Feb 20 16:48:47 crc kubenswrapper[4697]: I0220 16:48:47.767022 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-9d9d9f9cd-t7nvz" Feb 20 16:48:49 crc kubenswrapper[4697]: I0220 16:48:49.424288 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:49 crc kubenswrapper[4697]: I0220 16:48:49.435949 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33e3fc43-dbfe-4fff-bac3-6021dfa84982-webhook-certs\") pod \"openstack-operator-controller-manager-b45cc898b-82j7k\" (UID: \"33e3fc43-dbfe-4fff-bac3-6021dfa84982\") " pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:49 crc kubenswrapper[4697]: I0220 16:48:49.709189 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tn462" Feb 20 16:48:49 crc kubenswrapper[4697]: I0220 16:48:49.718317 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:49 crc kubenswrapper[4697]: I0220 16:48:49.773578 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms" event={"ID":"8546e4ea-d7f0-4244-8496-e962809c4203","Type":"ContainerStarted","Data":"a3e80832ad87149d2a59b7bca94253d2d95e5e5bdf3585d55ada7b5c77dce5e7"} Feb 20 16:48:49 crc kubenswrapper[4697]: I0220 16:48:49.783564 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" event={"ID":"3f3b7ed7-e806-4fa9-ac88-381c0b4bd237","Type":"ContainerStarted","Data":"8603d292fe299411ff5839ac3e13a176da41c968aabd8760a65b8a18d07b106d"} Feb 20 16:48:49 crc kubenswrapper[4697]: I0220 16:48:49.783750 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:49 crc kubenswrapper[4697]: I0220 16:48:49.806042 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnpms" podStartSLOduration=2.80667146 podStartE2EDuration="32.806020786s" podCreationTimestamp="2026-02-20 16:48:17 +0000 UTC" firstStartedPulling="2026-02-20 16:48:18.713555248 +0000 UTC m=+1006.493600656" lastFinishedPulling="2026-02-20 16:48:48.712904554 +0000 UTC m=+1036.492949982" observedRunningTime="2026-02-20 16:48:49.799762378 +0000 UTC m=+1037.579807816" watchObservedRunningTime="2026-02-20 16:48:49.806020786 +0000 UTC m=+1037.586066214" Feb 20 16:48:49 crc kubenswrapper[4697]: I0220 16:48:49.835756 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" podStartSLOduration=24.344856169 podStartE2EDuration="33.835735431s" 
podCreationTimestamp="2026-02-20 16:48:16 +0000 UTC" firstStartedPulling="2026-02-20 16:48:39.216535872 +0000 UTC m=+1026.996581280" lastFinishedPulling="2026-02-20 16:48:48.707415134 +0000 UTC m=+1036.487460542" observedRunningTime="2026-02-20 16:48:49.828497858 +0000 UTC m=+1037.608543306" watchObservedRunningTime="2026-02-20 16:48:49.835735431 +0000 UTC m=+1037.615780839" Feb 20 16:48:50 crc kubenswrapper[4697]: I0220 16:48:50.204719 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k"] Feb 20 16:48:50 crc kubenswrapper[4697]: I0220 16:48:50.793142 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" event={"ID":"33e3fc43-dbfe-4fff-bac3-6021dfa84982","Type":"ContainerStarted","Data":"3e03c545a38ea664114db8968a10bc7b175b2acf6f0da5c47c8fd566df8460e4"} Feb 20 16:48:50 crc kubenswrapper[4697]: I0220 16:48:50.793720 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" event={"ID":"33e3fc43-dbfe-4fff-bac3-6021dfa84982","Type":"ContainerStarted","Data":"c352c09c270dcae845b2992ab09b6eb3f304ccd09da75ea9917a36314f9ce1de"} Feb 20 16:48:50 crc kubenswrapper[4697]: I0220 16:48:50.820657 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" podStartSLOduration=33.820629814 podStartE2EDuration="33.820629814s" podCreationTimestamp="2026-02-20 16:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:48:50.814539929 +0000 UTC m=+1038.594585367" watchObservedRunningTime="2026-02-20 16:48:50.820629814 +0000 UTC m=+1038.600675262" Feb 20 16:48:51 crc kubenswrapper[4697]: I0220 16:48:51.800500 4697 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:48:53 crc kubenswrapper[4697]: I0220 16:48:53.039785 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-j9lmt" Feb 20 16:48:53 crc kubenswrapper[4697]: I0220 16:48:53.118851 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8" Feb 20 16:48:59 crc kubenswrapper[4697]: I0220 16:48:59.724257 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-b45cc898b-82j7k" Feb 20 16:49:01 crc kubenswrapper[4697]: I0220 16:49:01.184942 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:49:01 crc kubenswrapper[4697]: I0220 16:49:01.185027 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.745242 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b4cc7db87-7bvg8"] Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.746905 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.749391 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.749641 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.749774 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.749998 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rb56h" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.763703 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b4cc7db87-7bvg8"] Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.802224 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgs68\" (UniqueName: \"kubernetes.io/projected/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-kube-api-access-tgs68\") pod \"dnsmasq-dns-7b4cc7db87-7bvg8\" (UID: \"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0\") " pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.802295 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-config\") pod \"dnsmasq-dns-7b4cc7db87-7bvg8\" (UID: \"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0\") " pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.834064 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68cf5d8cfc-jjkq6"] Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.835464 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.838109 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.852891 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68cf5d8cfc-jjkq6"] Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.903023 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwfvr\" (UniqueName: \"kubernetes.io/projected/d8193e90-bf7e-4b08-8313-267825fd94a1-kube-api-access-vwfvr\") pod \"dnsmasq-dns-68cf5d8cfc-jjkq6\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.903102 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-dns-svc\") pod \"dnsmasq-dns-68cf5d8cfc-jjkq6\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.903136 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgs68\" (UniqueName: \"kubernetes.io/projected/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-kube-api-access-tgs68\") pod \"dnsmasq-dns-7b4cc7db87-7bvg8\" (UID: \"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0\") " pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.903158 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-config\") pod \"dnsmasq-dns-68cf5d8cfc-jjkq6\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" 
Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.903179 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-config\") pod \"dnsmasq-dns-7b4cc7db87-7bvg8\" (UID: \"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0\") " pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.904134 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-config\") pod \"dnsmasq-dns-7b4cc7db87-7bvg8\" (UID: \"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0\") " pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" Feb 20 16:49:18 crc kubenswrapper[4697]: I0220 16:49:18.957028 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgs68\" (UniqueName: \"kubernetes.io/projected/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-kube-api-access-tgs68\") pod \"dnsmasq-dns-7b4cc7db87-7bvg8\" (UID: \"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0\") " pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" Feb 20 16:49:19 crc kubenswrapper[4697]: I0220 16:49:19.004229 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-dns-svc\") pod \"dnsmasq-dns-68cf5d8cfc-jjkq6\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:19 crc kubenswrapper[4697]: I0220 16:49:19.004298 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-config\") pod \"dnsmasq-dns-68cf5d8cfc-jjkq6\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:19 crc kubenswrapper[4697]: I0220 16:49:19.004373 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vwfvr\" (UniqueName: \"kubernetes.io/projected/d8193e90-bf7e-4b08-8313-267825fd94a1-kube-api-access-vwfvr\") pod \"dnsmasq-dns-68cf5d8cfc-jjkq6\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:19 crc kubenswrapper[4697]: I0220 16:49:19.005707 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-dns-svc\") pod \"dnsmasq-dns-68cf5d8cfc-jjkq6\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:19 crc kubenswrapper[4697]: I0220 16:49:19.006412 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-config\") pod \"dnsmasq-dns-68cf5d8cfc-jjkq6\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:19 crc kubenswrapper[4697]: I0220 16:49:19.025695 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwfvr\" (UniqueName: \"kubernetes.io/projected/d8193e90-bf7e-4b08-8313-267825fd94a1-kube-api-access-vwfvr\") pod \"dnsmasq-dns-68cf5d8cfc-jjkq6\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:19 crc kubenswrapper[4697]: I0220 16:49:19.066226 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" Feb 20 16:49:19 crc kubenswrapper[4697]: I0220 16:49:19.158300 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:19 crc kubenswrapper[4697]: W0220 16:49:19.513317 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8fbd3ad_9df8_4a65_b37b_e4b63374c9f0.slice/crio-66cf3903b0f26b4c9a9aeecff6891c5ea5553332b6175d45776c351c6e63ca25 WatchSource:0}: Error finding container 66cf3903b0f26b4c9a9aeecff6891c5ea5553332b6175d45776c351c6e63ca25: Status 404 returned error can't find the container with id 66cf3903b0f26b4c9a9aeecff6891c5ea5553332b6175d45776c351c6e63ca25 Feb 20 16:49:19 crc kubenswrapper[4697]: I0220 16:49:19.517475 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b4cc7db87-7bvg8"] Feb 20 16:49:19 crc kubenswrapper[4697]: I0220 16:49:19.574241 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68cf5d8cfc-jjkq6"] Feb 20 16:49:19 crc kubenswrapper[4697]: W0220 16:49:19.574823 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8193e90_bf7e_4b08_8313_267825fd94a1.slice/crio-df6f6fae13523092b8a46e276d873f8a39189810f4cdaa01bb6d343ac2d9d326 WatchSource:0}: Error finding container df6f6fae13523092b8a46e276d873f8a39189810f4cdaa01bb6d343ac2d9d326: Status 404 returned error can't find the container with id df6f6fae13523092b8a46e276d873f8a39189810f4cdaa01bb6d343ac2d9d326 Feb 20 16:49:20 crc kubenswrapper[4697]: I0220 16:49:20.026989 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" event={"ID":"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0","Type":"ContainerStarted","Data":"66cf3903b0f26b4c9a9aeecff6891c5ea5553332b6175d45776c351c6e63ca25"} Feb 20 16:49:20 crc kubenswrapper[4697]: I0220 16:49:20.028217 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" 
event={"ID":"d8193e90-bf7e-4b08-8313-267825fd94a1","Type":"ContainerStarted","Data":"df6f6fae13523092b8a46e276d873f8a39189810f4cdaa01bb6d343ac2d9d326"} Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.499074 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68cf5d8cfc-jjkq6"] Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.517493 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d69579867-w4bmk"] Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.518639 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d69579867-w4bmk" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.527957 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d69579867-w4bmk"] Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.656132 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-dns-svc\") pod \"dnsmasq-dns-5d69579867-w4bmk\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") " pod="openstack/dnsmasq-dns-5d69579867-w4bmk" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.656402 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-config\") pod \"dnsmasq-dns-5d69579867-w4bmk\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") " pod="openstack/dnsmasq-dns-5d69579867-w4bmk" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.656517 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr6gq\" (UniqueName: \"kubernetes.io/projected/b6840114-f1c7-41d8-a497-7401c93a5b5c-kube-api-access-sr6gq\") pod \"dnsmasq-dns-5d69579867-w4bmk\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") " 
pod="openstack/dnsmasq-dns-5d69579867-w4bmk" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.758420 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-dns-svc\") pod \"dnsmasq-dns-5d69579867-w4bmk\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") " pod="openstack/dnsmasq-dns-5d69579867-w4bmk" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.758737 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-config\") pod \"dnsmasq-dns-5d69579867-w4bmk\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") " pod="openstack/dnsmasq-dns-5d69579867-w4bmk" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.758773 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr6gq\" (UniqueName: \"kubernetes.io/projected/b6840114-f1c7-41d8-a497-7401c93a5b5c-kube-api-access-sr6gq\") pod \"dnsmasq-dns-5d69579867-w4bmk\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") " pod="openstack/dnsmasq-dns-5d69579867-w4bmk" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.759825 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-config\") pod \"dnsmasq-dns-5d69579867-w4bmk\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") " pod="openstack/dnsmasq-dns-5d69579867-w4bmk" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.760208 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-dns-svc\") pod \"dnsmasq-dns-5d69579867-w4bmk\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") " pod="openstack/dnsmasq-dns-5d69579867-w4bmk" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.785255 4697 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b4cc7db87-7bvg8"] Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.787727 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr6gq\" (UniqueName: \"kubernetes.io/projected/b6840114-f1c7-41d8-a497-7401c93a5b5c-kube-api-access-sr6gq\") pod \"dnsmasq-dns-5d69579867-w4bmk\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") " pod="openstack/dnsmasq-dns-5d69579867-w4bmk" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.803844 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56b95674c9-gvkl2"] Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.804958 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.856112 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56b95674c9-gvkl2"] Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.890276 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d69579867-w4bmk" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.963022 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-config\") pod \"dnsmasq-dns-56b95674c9-gvkl2\" (UID: \"ed01b774-208a-47af-9e73-08531f4d5e6c\") " pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.963117 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-dns-svc\") pod \"dnsmasq-dns-56b95674c9-gvkl2\" (UID: \"ed01b774-208a-47af-9e73-08531f4d5e6c\") " pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:22 crc kubenswrapper[4697]: I0220 16:49:22.963136 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fncw\" (UniqueName: \"kubernetes.io/projected/ed01b774-208a-47af-9e73-08531f4d5e6c-kube-api-access-8fncw\") pod \"dnsmasq-dns-56b95674c9-gvkl2\" (UID: \"ed01b774-208a-47af-9e73-08531f4d5e6c\") " pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.064164 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-config\") pod \"dnsmasq-dns-56b95674c9-gvkl2\" (UID: \"ed01b774-208a-47af-9e73-08531f4d5e6c\") " pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.064277 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-dns-svc\") pod \"dnsmasq-dns-56b95674c9-gvkl2\" (UID: \"ed01b774-208a-47af-9e73-08531f4d5e6c\") " 
pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.064319 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fncw\" (UniqueName: \"kubernetes.io/projected/ed01b774-208a-47af-9e73-08531f4d5e6c-kube-api-access-8fncw\") pod \"dnsmasq-dns-56b95674c9-gvkl2\" (UID: \"ed01b774-208a-47af-9e73-08531f4d5e6c\") " pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.065948 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-dns-svc\") pod \"dnsmasq-dns-56b95674c9-gvkl2\" (UID: \"ed01b774-208a-47af-9e73-08531f4d5e6c\") " pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.066067 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-config\") pod \"dnsmasq-dns-56b95674c9-gvkl2\" (UID: \"ed01b774-208a-47af-9e73-08531f4d5e6c\") " pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.068847 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56b95674c9-gvkl2"] Feb 20 16:49:23 crc kubenswrapper[4697]: E0220 16:49:23.069517 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-8fncw], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" podUID="ed01b774-208a-47af-9e73-08531f4d5e6c" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.088882 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fncw\" (UniqueName: \"kubernetes.io/projected/ed01b774-208a-47af-9e73-08531f4d5e6c-kube-api-access-8fncw\") pod \"dnsmasq-dns-56b95674c9-gvkl2\" (UID: 
\"ed01b774-208a-47af-9e73-08531f4d5e6c\") " pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.095336 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b44f8bbb5-5v6fh"] Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.101939 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.107628 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b44f8bbb5-5v6fh"] Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.267722 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-config\") pod \"dnsmasq-dns-7b44f8bbb5-5v6fh\" (UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.267834 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzhzq\" (UniqueName: \"kubernetes.io/projected/afbfdf08-d446-4739-b771-0244cb6001d9-kube-api-access-tzhzq\") pod \"dnsmasq-dns-7b44f8bbb5-5v6fh\" (UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.267863 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-dns-svc\") pod \"dnsmasq-dns-7b44f8bbb5-5v6fh\" (UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.368517 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzhzq\" (UniqueName: 
\"kubernetes.io/projected/afbfdf08-d446-4739-b771-0244cb6001d9-kube-api-access-tzhzq\") pod \"dnsmasq-dns-7b44f8bbb5-5v6fh\" (UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.368558 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-dns-svc\") pod \"dnsmasq-dns-7b44f8bbb5-5v6fh\" (UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.368611 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-config\") pod \"dnsmasq-dns-7b44f8bbb5-5v6fh\" (UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.369313 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-config\") pod \"dnsmasq-dns-7b44f8bbb5-5v6fh\" (UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.369454 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-dns-svc\") pod \"dnsmasq-dns-7b44f8bbb5-5v6fh\" (UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.383460 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzhzq\" (UniqueName: \"kubernetes.io/projected/afbfdf08-d446-4739-b771-0244cb6001d9-kube-api-access-tzhzq\") pod \"dnsmasq-dns-7b44f8bbb5-5v6fh\" 
(UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.427232 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.669972 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.674212 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.676081 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-notifications-rabbitmq-svc" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.677129 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-server-conf" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.677269 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-plugins-conf" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.677412 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-config-data" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.677545 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-server-dockercfg-ltzkb" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.677714 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-erlang-cookie" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.677896 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-default-user" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.683160 4697 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.773381 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.773454 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/591f7e7d-78bf-43a5-afe2-119f93765311-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.773479 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.773648 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.773731 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/591f7e7d-78bf-43a5-afe2-119f93765311-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.773779 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/591f7e7d-78bf-43a5-afe2-119f93765311-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.773832 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/591f7e7d-78bf-43a5-afe2-119f93765311-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.773868 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/591f7e7d-78bf-43a5-afe2-119f93765311-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.773903 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.773921 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ht9ql\" (UniqueName: \"kubernetes.io/projected/591f7e7d-78bf-43a5-afe2-119f93765311-kube-api-access-ht9ql\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.774020 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.875011 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.875059 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/591f7e7d-78bf-43a5-afe2-119f93765311-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.875084 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/591f7e7d-78bf-43a5-afe2-119f93765311-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.875110 4697 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/591f7e7d-78bf-43a5-afe2-119f93765311-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.875132 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/591f7e7d-78bf-43a5-afe2-119f93765311-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.875156 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.875173 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht9ql\" (UniqueName: \"kubernetes.io/projected/591f7e7d-78bf-43a5-afe2-119f93765311-kube-api-access-ht9ql\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.875223 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.875239 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.875261 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/591f7e7d-78bf-43a5-afe2-119f93765311-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.875276 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.876036 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/591f7e7d-78bf-43a5-afe2-119f93765311-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.876354 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.876487 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/591f7e7d-78bf-43a5-afe2-119f93765311-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.876522 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/591f7e7d-78bf-43a5-afe2-119f93765311-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.877170 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.877362 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.878389 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.884872 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/591f7e7d-78bf-43a5-afe2-119f93765311-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.887427 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/591f7e7d-78bf-43a5-afe2-119f93765311-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.889525 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/591f7e7d-78bf-43a5-afe2-119f93765311-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.895496 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht9ql\" (UniqueName: \"kubernetes.io/projected/591f7e7d-78bf-43a5-afe2-119f93765311-kube-api-access-ht9ql\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.899845 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"591f7e7d-78bf-43a5-afe2-119f93765311\") " pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.945755 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.948591 4697 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.951933 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.952333 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.952574 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.952789 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vxlxq" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.953016 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.953040 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.953105 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 20 16:49:23 crc kubenswrapper[4697]: I0220 16:49:23.964768 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.031743 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.053953 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.061172 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.077645 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.077729 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5224cc9f-d610-4ea0-94da-11cdb019dcce-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.077754 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.077781 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jvlw\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-kube-api-access-4jvlw\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.077894 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.077923 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.077962 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.078004 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.078055 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.078080 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5224cc9f-d610-4ea0-94da-11cdb019dcce-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc 
kubenswrapper[4697]: I0220 16:49:24.078097 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-config-data\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.178786 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fncw\" (UniqueName: \"kubernetes.io/projected/ed01b774-208a-47af-9e73-08531f4d5e6c-kube-api-access-8fncw\") pod \"ed01b774-208a-47af-9e73-08531f4d5e6c\" (UID: \"ed01b774-208a-47af-9e73-08531f4d5e6c\") " Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.178896 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-config\") pod \"ed01b774-208a-47af-9e73-08531f4d5e6c\" (UID: \"ed01b774-208a-47af-9e73-08531f4d5e6c\") " Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.178980 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-dns-svc\") pod \"ed01b774-208a-47af-9e73-08531f4d5e6c\" (UID: \"ed01b774-208a-47af-9e73-08531f4d5e6c\") " Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179205 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179252 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179284 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5224cc9f-d610-4ea0-94da-11cdb019dcce-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179310 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-config-data\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179332 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179379 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5224cc9f-d610-4ea0-94da-11cdb019dcce-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179401 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " 
pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179458 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jvlw\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-kube-api-access-4jvlw\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179459 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-config" (OuterVolumeSpecName: "config") pod "ed01b774-208a-47af-9e73-08531f4d5e6c" (UID: "ed01b774-208a-47af-9e73-08531f4d5e6c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179506 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179535 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179559 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179619 
4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.179849 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.180260 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed01b774-208a-47af-9e73-08531f4d5e6c" (UID: "ed01b774-208a-47af-9e73-08531f4d5e6c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.180328 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.181748 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-config-data\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.182214 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.182930 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.184108 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed01b774-208a-47af-9e73-08531f4d5e6c-kube-api-access-8fncw" (OuterVolumeSpecName: "kube-api-access-8fncw") pod "ed01b774-208a-47af-9e73-08531f4d5e6c" (UID: "ed01b774-208a-47af-9e73-08531f4d5e6c"). InnerVolumeSpecName "kube-api-access-8fncw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.184224 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.184571 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.193635 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 
16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.207504 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5224cc9f-d610-4ea0-94da-11cdb019dcce-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.211513 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5224cc9f-d610-4ea0-94da-11cdb019dcce-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.212152 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jvlw\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-kube-api-access-4jvlw\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.214148 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.223255 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.226208 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.229404 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.236603 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.236780 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.237064 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.237220 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.237393 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lcn8w" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.237548 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.237703 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.275253 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.280809 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed01b774-208a-47af-9e73-08531f4d5e6c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.280838 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fncw\" (UniqueName: \"kubernetes.io/projected/ed01b774-208a-47af-9e73-08531f4d5e6c-kube-api-access-8fncw\") on node \"crc\" DevicePath \"\"" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.381724 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.381776 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjs6q\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-kube-api-access-bjs6q\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.381795 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.381832 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.381855 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.381875 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.381896 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.381915 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.382174 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.382191 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.382212 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.483216 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.483261 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.483311 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.483337 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.483357 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.483384 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.483397 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.483422 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.483465 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.483494 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjs6q\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-kube-api-access-bjs6q\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.483509 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.484453 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.484501 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") device mount path \"/mnt/openstack/pv03\"" 
pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.484667 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.484671 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.485198 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.485213 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.487975 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.488258 4697 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.488577 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.490704 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.512072 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjs6q\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-kube-api-access-bjs6q\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.543938 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:24 crc kubenswrapper[4697]: I0220 16:49:24.577706 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.060976 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56b95674c9-gvkl2" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.107562 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56b95674c9-gvkl2"] Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.112591 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56b95674c9-gvkl2"] Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.636199 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.640513 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.646233 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-tp7qs" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.646592 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.647547 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.650473 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.651088 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.651671 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.707528 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/33594c24-be5d-42de-ba91-5584becb21e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.707817 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33594c24-be5d-42de-ba91-5584becb21e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.707886 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc5wn\" (UniqueName: \"kubernetes.io/projected/33594c24-be5d-42de-ba91-5584becb21e3-kube-api-access-pc5wn\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.707908 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/33594c24-be5d-42de-ba91-5584becb21e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.707945 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/33594c24-be5d-42de-ba91-5584becb21e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.707962 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.707992 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33594c24-be5d-42de-ba91-5584becb21e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.708013 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/33594c24-be5d-42de-ba91-5584becb21e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.812462 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/33594c24-be5d-42de-ba91-5584becb21e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.812722 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/33594c24-be5d-42de-ba91-5584becb21e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.812746 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.812783 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33594c24-be5d-42de-ba91-5584becb21e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.812816 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/33594c24-be5d-42de-ba91-5584becb21e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.812849 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/33594c24-be5d-42de-ba91-5584becb21e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.812875 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33594c24-be5d-42de-ba91-5584becb21e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.813127 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc5wn\" (UniqueName: \"kubernetes.io/projected/33594c24-be5d-42de-ba91-5584becb21e3-kube-api-access-pc5wn\") pod \"openstack-galera-0\" (UID: 
\"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.814255 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/33594c24-be5d-42de-ba91-5584becb21e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.815013 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33594c24-be5d-42de-ba91-5584becb21e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.815133 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.815277 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/33594c24-be5d-42de-ba91-5584becb21e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.816086 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/33594c24-be5d-42de-ba91-5584becb21e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.821855 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33594c24-be5d-42de-ba91-5584becb21e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.838172 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/33594c24-be5d-42de-ba91-5584becb21e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.847627 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc5wn\" (UniqueName: \"kubernetes.io/projected/33594c24-be5d-42de-ba91-5584becb21e3-kube-api-access-pc5wn\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.849359 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"33594c24-be5d-42de-ba91-5584becb21e3\") " pod="openstack/openstack-galera-0" Feb 20 16:49:25 crc kubenswrapper[4697]: I0220 16:49:25.973505 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 16:49:26 crc kubenswrapper[4697]: I0220 16:49:26.886040 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed01b774-208a-47af-9e73-08531f4d5e6c" path="/var/lib/kubelet/pods/ed01b774-208a-47af-9e73-08531f4d5e6c/volumes" Feb 20 16:49:26 crc kubenswrapper[4697]: I0220 16:49:26.916521 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 16:49:26 crc kubenswrapper[4697]: I0220 16:49:26.917824 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:26 crc kubenswrapper[4697]: I0220 16:49:26.921065 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ljnkd" Feb 20 16:49:26 crc kubenswrapper[4697]: I0220 16:49:26.921839 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 20 16:49:26 crc kubenswrapper[4697]: I0220 16:49:26.922211 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 20 16:49:26 crc kubenswrapper[4697]: I0220 16:49:26.923163 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 20 16:49:26 crc kubenswrapper[4697]: I0220 16:49:26.928969 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.031260 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b33962a9-1867-4e1c-b597-d426ecf83e50-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.031330 4697 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b33962a9-1867-4e1c-b597-d426ecf83e50-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.031374 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b33962a9-1867-4e1c-b597-d426ecf83e50-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.031398 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.031425 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b33962a9-1867-4e1c-b597-d426ecf83e50-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.031483 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cnm2\" (UniqueName: \"kubernetes.io/projected/b33962a9-1867-4e1c-b597-d426ecf83e50-kube-api-access-7cnm2\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.031504 
4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33962a9-1867-4e1c-b597-d426ecf83e50-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.031523 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33962a9-1867-4e1c-b597-d426ecf83e50-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.132601 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b33962a9-1867-4e1c-b597-d426ecf83e50-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.132989 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.133039 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b33962a9-1867-4e1c-b597-d426ecf83e50-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.133105 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7cnm2\" (UniqueName: \"kubernetes.io/projected/b33962a9-1867-4e1c-b597-d426ecf83e50-kube-api-access-7cnm2\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.133134 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33962a9-1867-4e1c-b597-d426ecf83e50-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.133166 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33962a9-1867-4e1c-b597-d426ecf83e50-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.133199 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b33962a9-1867-4e1c-b597-d426ecf83e50-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.133277 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b33962a9-1867-4e1c-b597-d426ecf83e50-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.134193 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.135773 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b33962a9-1867-4e1c-b597-d426ecf83e50-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.150014 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b33962a9-1867-4e1c-b597-d426ecf83e50-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.154181 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33962a9-1867-4e1c-b597-d426ecf83e50-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.157639 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b33962a9-1867-4e1c-b597-d426ecf83e50-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.157957 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b33962a9-1867-4e1c-b597-d426ecf83e50-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.163827 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33962a9-1867-4e1c-b597-d426ecf83e50-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.170492 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cnm2\" (UniqueName: \"kubernetes.io/projected/b33962a9-1867-4e1c-b597-d426ecf83e50-kube-api-access-7cnm2\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.174000 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b33962a9-1867-4e1c-b597-d426ecf83e50\") " pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.187375 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.188538 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.195365 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-hl8nd" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.195736 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.196058 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.205600 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.234136 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19a9810e-52c4-4428-9fca-5bf65d100f50-kolla-config\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.234216 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsd2p\" (UniqueName: \"kubernetes.io/projected/19a9810e-52c4-4428-9fca-5bf65d100f50-kube-api-access-qsd2p\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.234284 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/19a9810e-52c4-4428-9fca-5bf65d100f50-memcached-tls-certs\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.234321 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19a9810e-52c4-4428-9fca-5bf65d100f50-combined-ca-bundle\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.234393 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19a9810e-52c4-4428-9fca-5bf65d100f50-config-data\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.246262 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.335638 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19a9810e-52c4-4428-9fca-5bf65d100f50-config-data\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.335723 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19a9810e-52c4-4428-9fca-5bf65d100f50-kolla-config\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.335808 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsd2p\" (UniqueName: \"kubernetes.io/projected/19a9810e-52c4-4428-9fca-5bf65d100f50-kube-api-access-qsd2p\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.335854 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/19a9810e-52c4-4428-9fca-5bf65d100f50-memcached-tls-certs\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.335873 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19a9810e-52c4-4428-9fca-5bf65d100f50-combined-ca-bundle\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.336912 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19a9810e-52c4-4428-9fca-5bf65d100f50-kolla-config\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.337020 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19a9810e-52c4-4428-9fca-5bf65d100f50-config-data\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.340103 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19a9810e-52c4-4428-9fca-5bf65d100f50-combined-ca-bundle\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.344955 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/19a9810e-52c4-4428-9fca-5bf65d100f50-memcached-tls-certs\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc 
kubenswrapper[4697]: I0220 16:49:27.354572 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsd2p\" (UniqueName: \"kubernetes.io/projected/19a9810e-52c4-4428-9fca-5bf65d100f50-kube-api-access-qsd2p\") pod \"memcached-0\" (UID: \"19a9810e-52c4-4428-9fca-5bf65d100f50\") " pod="openstack/memcached-0" Feb 20 16:49:27 crc kubenswrapper[4697]: I0220 16:49:27.536925 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 16:49:29 crc kubenswrapper[4697]: I0220 16:49:29.486829 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 16:49:29 crc kubenswrapper[4697]: I0220 16:49:29.490779 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 16:49:29 crc kubenswrapper[4697]: I0220 16:49:29.501487 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-kxhcl" Feb 20 16:49:29 crc kubenswrapper[4697]: I0220 16:49:29.518605 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 16:49:29 crc kubenswrapper[4697]: I0220 16:49:29.569373 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4sb2\" (UniqueName: \"kubernetes.io/projected/9e054b02-2e5f-4205-a29b-e1365412c207-kube-api-access-k4sb2\") pod \"kube-state-metrics-0\" (UID: \"9e054b02-2e5f-4205-a29b-e1365412c207\") " pod="openstack/kube-state-metrics-0" Feb 20 16:49:29 crc kubenswrapper[4697]: I0220 16:49:29.671095 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4sb2\" (UniqueName: \"kubernetes.io/projected/9e054b02-2e5f-4205-a29b-e1365412c207-kube-api-access-k4sb2\") pod \"kube-state-metrics-0\" (UID: \"9e054b02-2e5f-4205-a29b-e1365412c207\") " pod="openstack/kube-state-metrics-0" Feb 20 16:49:29 crc 
kubenswrapper[4697]: I0220 16:49:29.690732 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4sb2\" (UniqueName: \"kubernetes.io/projected/9e054b02-2e5f-4205-a29b-e1365412c207-kube-api-access-k4sb2\") pod \"kube-state-metrics-0\" (UID: \"9e054b02-2e5f-4205-a29b-e1365412c207\") " pod="openstack/kube-state-metrics-0" Feb 20 16:49:29 crc kubenswrapper[4697]: I0220 16:49:29.806308 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.783581 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.786133 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.787966 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-tnj5w" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.790300 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.793195 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.793779 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.794259 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.794751 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 
20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.795121 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.816865 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.817840 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.889071 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.889130 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.889158 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.889183 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.889202 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.889236 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.889263 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.889306 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 
crc kubenswrapper[4697]: I0220 16:49:30.889321 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.889348 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw2pz\" (UniqueName: \"kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-kube-api-access-lw2pz\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.992393 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.992458 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.992499 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw2pz\" (UniqueName: \"kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-kube-api-access-lw2pz\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " 
pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.992552 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.992579 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.992605 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.992632 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.992650 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " 
pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.992688 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.992713 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.993202 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.993379 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.995147 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.997108 4697 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.997138 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ab999197bb553d62566307ca20e48f871152403b8e1c643d4e6f778eae279956/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.998073 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.999178 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:30 crc kubenswrapper[4697]: I0220 16:49:30.999632 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:31 crc kubenswrapper[4697]: I0220 16:49:31.002907 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:31 crc kubenswrapper[4697]: I0220 16:49:31.003177 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:31 crc kubenswrapper[4697]: I0220 16:49:31.020378 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw2pz\" (UniqueName: \"kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-kube-api-access-lw2pz\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:31 crc kubenswrapper[4697]: I0220 16:49:31.021375 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:31 crc kubenswrapper[4697]: I0220 16:49:31.180275 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 16:49:31 crc kubenswrapper[4697]: I0220 16:49:31.184647 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:49:31 crc kubenswrapper[4697]: I0220 16:49:31.184718 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.816833 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-62kkp"] Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.817965 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-62kkp" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.819987 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.819999 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.820281 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-8kvtp" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.834135 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-62kkp"] Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.920166 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-7wh5h"] Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.924264 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.927103 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-combined-ca-bundle\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.927180 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-ovn-controller-tls-certs\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.927243 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-var-run-ovn\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.927284 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-scripts\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.927321 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-var-run\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " 
pod="openstack/ovn-controller-62kkp" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.927345 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-var-log-ovn\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.927386 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-542db\" (UniqueName: \"kubernetes.io/projected/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-kube-api-access-542db\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:32 crc kubenswrapper[4697]: I0220 16:49:32.936565 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7wh5h"] Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.030495 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-scripts\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.030550 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-var-run\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.030571 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-var-log-ovn\") pod \"ovn-controller-62kkp\" (UID: 
\"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.031123 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-542db\" (UniqueName: \"kubernetes.io/projected/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-kube-api-access-542db\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.031142 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-var-run\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.031169 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vzcf\" (UniqueName: \"kubernetes.io/projected/be3e14f5-1877-4618-87b6-b60623792988-kube-api-access-6vzcf\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.031270 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-etc-ovs\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.031309 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-var-run\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " 
pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.031347 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-combined-ca-bundle\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.031421 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-var-log\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.031496 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-ovn-controller-tls-certs\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.031682 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-var-log-ovn\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.031857 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-var-lib\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 
16:49:33.032587 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3e14f5-1877-4618-87b6-b60623792988-scripts\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.032775 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-var-run-ovn\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.032954 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-var-run-ovn\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.033744 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-scripts\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.038819 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-ovn-controller-tls-certs\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.039073 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-combined-ca-bundle\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.045154 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-542db\" (UniqueName: \"kubernetes.io/projected/aa3e6a99-a9a4-4578-94c6-8a4b641405ec-kube-api-access-542db\") pod \"ovn-controller-62kkp\" (UID: \"aa3e6a99-a9a4-4578-94c6-8a4b641405ec\") " pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.134587 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vzcf\" (UniqueName: \"kubernetes.io/projected/be3e14f5-1877-4618-87b6-b60623792988-kube-api-access-6vzcf\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.134631 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-etc-ovs\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.134648 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-var-run\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.134680 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-var-log\") pod \"ovn-controller-ovs-7wh5h\" (UID: 
\"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.134710 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-var-lib\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.134742 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3e14f5-1877-4618-87b6-b60623792988-scripts\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.134976 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-etc-ovs\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.135073 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-var-run\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.135112 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-var-log\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.135235 4697 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/be3e14f5-1877-4618-87b6-b60623792988-var-lib\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.136831 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3e14f5-1877-4618-87b6-b60623792988-scripts\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.139291 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-62kkp" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.150222 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vzcf\" (UniqueName: \"kubernetes.io/projected/be3e14f5-1877-4618-87b6-b60623792988-kube-api-access-6vzcf\") pod \"ovn-controller-ovs-7wh5h\" (UID: \"be3e14f5-1877-4618-87b6-b60623792988\") " pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.238039 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.473745 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.480750 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.484375 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-q6hvk" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.491719 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.492343 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.492357 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.492755 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.492879 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.541298 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b179bb2c-61ea-4bca-860b-b419bb8d3341-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.541346 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b179bb2c-61ea-4bca-860b-b419bb8d3341-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.541371 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-jnsrz\" (UniqueName: \"kubernetes.io/projected/b179bb2c-61ea-4bca-860b-b419bb8d3341-kube-api-access-jnsrz\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.541529 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.541649 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b179bb2c-61ea-4bca-860b-b419bb8d3341-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.541793 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b179bb2c-61ea-4bca-860b-b419bb8d3341-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.541821 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b179bb2c-61ea-4bca-860b-b419bb8d3341-config\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.541842 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b179bb2c-61ea-4bca-860b-b419bb8d3341-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.643215 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b179bb2c-61ea-4bca-860b-b419bb8d3341-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.643319 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b179bb2c-61ea-4bca-860b-b419bb8d3341-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.643345 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b179bb2c-61ea-4bca-860b-b419bb8d3341-config\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.643367 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b179bb2c-61ea-4bca-860b-b419bb8d3341-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.643579 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b179bb2c-61ea-4bca-860b-b419bb8d3341-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " 
pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.643645 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b179bb2c-61ea-4bca-860b-b419bb8d3341-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.643696 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnsrz\" (UniqueName: \"kubernetes.io/projected/b179bb2c-61ea-4bca-860b-b419bb8d3341-kube-api-access-jnsrz\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.643764 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.644149 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.644275 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b179bb2c-61ea-4bca-860b-b419bb8d3341-config\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.644558 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b179bb2c-61ea-4bca-860b-b419bb8d3341-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.644672 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b179bb2c-61ea-4bca-860b-b419bb8d3341-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.646318 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b179bb2c-61ea-4bca-860b-b419bb8d3341-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.647280 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b179bb2c-61ea-4bca-860b-b419bb8d3341-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.653173 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b179bb2c-61ea-4bca-860b-b419bb8d3341-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.663010 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc 
kubenswrapper[4697]: I0220 16:49:33.663618 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnsrz\" (UniqueName: \"kubernetes.io/projected/b179bb2c-61ea-4bca-860b-b419bb8d3341-kube-api-access-jnsrz\") pod \"ovsdbserver-sb-0\" (UID: \"b179bb2c-61ea-4bca-860b-b419bb8d3341\") " pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:33 crc kubenswrapper[4697]: I0220 16:49:33.807758 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 16:49:34 crc kubenswrapper[4697]: I0220 16:49:34.134230 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 20 16:49:34 crc kubenswrapper[4697]: W0220 16:49:34.566332 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod591f7e7d_78bf_43a5_afe2_119f93765311.slice/crio-0493fb4aa1c80cf40ba159ad42e71d862e47445f14b848d5e976509e98af847c WatchSource:0}: Error finding container 0493fb4aa1c80cf40ba159ad42e71d862e47445f14b848d5e976509e98af847c: Status 404 returned error can't find the container with id 0493fb4aa1c80cf40ba159ad42e71d862e47445f14b848d5e976509e98af847c Feb 20 16:49:34 crc kubenswrapper[4697]: E0220 16:49:34.579226 4697 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 20 16:49:34 crc kubenswrapper[4697]: E0220 16:49:34.579270 4697 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 20 16:49:34 crc kubenswrapper[4697]: E0220 16:49:34.579378 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:38.102.83.38:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tgs68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-7b4cc7db87-7bvg8_openstack(b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 16:49:34 crc kubenswrapper[4697]: E0220 16:49:34.580690 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" podUID="b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0" Feb 20 16:49:34 crc kubenswrapper[4697]: E0220 16:49:34.644792 4697 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 20 16:49:34 crc kubenswrapper[4697]: E0220 16:49:34.644898 4697 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 20 16:49:34 crc kubenswrapper[4697]: E0220 16:49:34.645074 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.38:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vwfvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-68cf5d8cfc-jjkq6_openstack(d8193e90-bf7e-4b08-8313-267825fd94a1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 16:49:34 crc kubenswrapper[4697]: E0220 16:49:34.646394 4697 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" podUID="d8193e90-bf7e-4b08-8313-267825fd94a1" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.100884 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d69579867-w4bmk"] Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.112335 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.149211 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d69579867-w4bmk" event={"ID":"b6840114-f1c7-41d8-a497-7401c93a5b5c","Type":"ContainerStarted","Data":"045a99a37beac5b9d3c9ec8c9df3cac05f0003cb55c7fe60e5bd2785fe9eacb6"} Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.150183 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"591f7e7d-78bf-43a5-afe2-119f93765311","Type":"ContainerStarted","Data":"0493fb4aa1c80cf40ba159ad42e71d862e47445f14b848d5e976509e98af847c"} Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.162726 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33594c24-be5d-42de-ba91-5584becb21e3","Type":"ContainerStarted","Data":"b63213cef0c28ddcac5dadbbfe2a713db514a3b02370452524c37e78232dea94"} Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.350683 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.472050 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.481256 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 16:49:35 
crc kubenswrapper[4697]: I0220 16:49:35.593252 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-62kkp"] Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.609315 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.617626 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b44f8bbb5-5v6fh"] Feb 20 16:49:35 crc kubenswrapper[4697]: W0220 16:49:35.704594 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafbfdf08_d446_4739_b771_0244cb6001d9.slice/crio-9d0afd6a96239f57fca698856bac7c21be94a75247903d3458015219bd033e75 WatchSource:0}: Error finding container 9d0afd6a96239f57fca698856bac7c21be94a75247903d3458015219bd033e75: Status 404 returned error can't find the container with id 9d0afd6a96239f57fca698856bac7c21be94a75247903d3458015219bd033e75 Feb 20 16:49:35 crc kubenswrapper[4697]: W0220 16:49:35.752100 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa3e6a99_a9a4_4578_94c6_8a4b641405ec.slice/crio-234679b38e6d982d6d03a07c795c954eba2171c2cee71f2c79a1342518f431d6 WatchSource:0}: Error finding container 234679b38e6d982d6d03a07c795c954eba2171c2cee71f2c79a1342518f431d6: Status 404 returned error can't find the container with id 234679b38e6d982d6d03a07c795c954eba2171c2cee71f2c79a1342518f431d6 Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.760792 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.772012 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.772117 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7wh5h"] Feb 20 16:49:35 crc kubenswrapper[4697]: W0220 16:49:35.799095 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe3e14f5_1877_4618_87b6_b60623792988.slice/crio-dcdf16f7146e5de714895a63594b67aa69f5404e8022d4f41573b6b930d25156 WatchSource:0}: Error finding container dcdf16f7146e5de714895a63594b67aa69f5404e8022d4f41573b6b930d25156: Status 404 returned error can't find the container with id dcdf16f7146e5de714895a63594b67aa69f5404e8022d4f41573b6b930d25156 Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.893499 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-config\") pod \"d8193e90-bf7e-4b08-8313-267825fd94a1\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.893614 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwfvr\" (UniqueName: \"kubernetes.io/projected/d8193e90-bf7e-4b08-8313-267825fd94a1-kube-api-access-vwfvr\") pod \"d8193e90-bf7e-4b08-8313-267825fd94a1\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.893689 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-dns-svc\") pod \"d8193e90-bf7e-4b08-8313-267825fd94a1\" (UID: \"d8193e90-bf7e-4b08-8313-267825fd94a1\") " Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.893741 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgs68\" 
(UniqueName: \"kubernetes.io/projected/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-kube-api-access-tgs68\") pod \"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0\" (UID: \"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0\") " Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.893779 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-config\") pod \"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0\" (UID: \"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0\") " Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.894275 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-config" (OuterVolumeSpecName: "config") pod "d8193e90-bf7e-4b08-8313-267825fd94a1" (UID: "d8193e90-bf7e-4b08-8313-267825fd94a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.894965 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-config" (OuterVolumeSpecName: "config") pod "b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0" (UID: "b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.895298 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.895331 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.896290 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8193e90-bf7e-4b08-8313-267825fd94a1" (UID: "d8193e90-bf7e-4b08-8313-267825fd94a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.898689 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-kube-api-access-tgs68" (OuterVolumeSpecName: "kube-api-access-tgs68") pod "b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0" (UID: "b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0"). InnerVolumeSpecName "kube-api-access-tgs68". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.901753 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.904793 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8193e90-bf7e-4b08-8313-267825fd94a1-kube-api-access-vwfvr" (OuterVolumeSpecName: "kube-api-access-vwfvr") pod "d8193e90-bf7e-4b08-8313-267825fd94a1" (UID: "d8193e90-bf7e-4b08-8313-267825fd94a1"). 
InnerVolumeSpecName "kube-api-access-vwfvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.907426 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 16:49:35 crc kubenswrapper[4697]: W0220 16:49:35.918716 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19a9810e_52c4_4428_9fca_5bf65d100f50.slice/crio-8c8e1d4d641d99e6ef4f7084eba892dc9cf71102cf955d22efc4be7448463cf6 WatchSource:0}: Error finding container 8c8e1d4d641d99e6ef4f7084eba892dc9cf71102cf955d22efc4be7448463cf6: Status 404 returned error can't find the container with id 8c8e1d4d641d99e6ef4f7084eba892dc9cf71102cf955d22efc4be7448463cf6 Feb 20 16:49:35 crc kubenswrapper[4697]: W0220 16:49:35.920096 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e054b02_2e5f_4205_a29b_e1365412c207.slice/crio-9f0ed8ebb271460b28518b7d496b1ec1f07b2195bb59b3954c35d0850d3e8929 WatchSource:0}: Error finding container 9f0ed8ebb271460b28518b7d496b1ec1f07b2195bb59b3954c35d0850d3e8929: Status 404 returned error can't find the container with id 9f0ed8ebb271460b28518b7d496b1ec1f07b2195bb59b3954c35d0850d3e8929 Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.974928 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.997014 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwfvr\" (UniqueName: \"kubernetes.io/projected/d8193e90-bf7e-4b08-8313-267825fd94a1-kube-api-access-vwfvr\") on node \"crc\" DevicePath \"\"" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.997045 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8193e90-bf7e-4b08-8313-267825fd94a1-dns-svc\") on node 
\"crc\" DevicePath \"\"" Feb 20 16:49:35 crc kubenswrapper[4697]: I0220 16:49:35.997054 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgs68\" (UniqueName: \"kubernetes.io/projected/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0-kube-api-access-tgs68\") on node \"crc\" DevicePath \"\"" Feb 20 16:49:36 crc kubenswrapper[4697]: W0220 16:49:36.023341 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb179bb2c_61ea_4bca_860b_b419bb8d3341.slice/crio-3c9e894329641f6359ea192c8edfa200b293954e323086aab80f6ee30bac69c1 WatchSource:0}: Error finding container 3c9e894329641f6359ea192c8edfa200b293954e323086aab80f6ee30bac69c1: Status 404 returned error can't find the container with id 3c9e894329641f6359ea192c8edfa200b293954e323086aab80f6ee30bac69c1 Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.171087 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b179bb2c-61ea-4bca-860b-b419bb8d3341","Type":"ContainerStarted","Data":"3c9e894329641f6359ea192c8edfa200b293954e323086aab80f6ee30bac69c1"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.173079 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5224cc9f-d610-4ea0-94da-11cdb019dcce","Type":"ContainerStarted","Data":"149f6e09cca44ea0851008e6e2eee087d7dc51b26598c0b4fe8bd8fc9e577edb"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.178128 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"19a9810e-52c4-4428-9fca-5bf65d100f50","Type":"ContainerStarted","Data":"8c8e1d4d641d99e6ef4f7084eba892dc9cf71102cf955d22efc4be7448463cf6"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.180135 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"40ca67b4-1eb6-40a6-ad33-1982ed83eb63","Type":"ContainerStarted","Data":"4031a5ea37ad4e319c7d2cb4d52b2422257aba8e507bbff9fafa3d8901f117bf"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.181073 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-62kkp" event={"ID":"aa3e6a99-a9a4-4578-94c6-8a4b641405ec","Type":"ContainerStarted","Data":"234679b38e6d982d6d03a07c795c954eba2171c2cee71f2c79a1342518f431d6"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.182194 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" event={"ID":"d8193e90-bf7e-4b08-8313-267825fd94a1","Type":"ContainerDied","Data":"df6f6fae13523092b8a46e276d873f8a39189810f4cdaa01bb6d343ac2d9d326"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.182253 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68cf5d8cfc-jjkq6" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.196974 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9e054b02-2e5f-4205-a29b-e1365412c207","Type":"ContainerStarted","Data":"9f0ed8ebb271460b28518b7d496b1ec1f07b2195bb59b3954c35d0850d3e8929"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.198672 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c08b5df3-217d-41d0-b021-d29a0b7e7dd2","Type":"ContainerStarted","Data":"6f6d21fccb187ea83597da718508115bf7743842b00388bdb3c4e037695256dd"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.202503 4697 generic.go:334] "Generic (PLEG): container finished" podID="b6840114-f1c7-41d8-a497-7401c93a5b5c" containerID="3e100501e8b4bba5f4909896fecb235daf8b943a55369ad2e610cfefbd94fb7c" exitCode=0 Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.202603 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5d69579867-w4bmk" event={"ID":"b6840114-f1c7-41d8-a497-7401c93a5b5c","Type":"ContainerDied","Data":"3e100501e8b4bba5f4909896fecb235daf8b943a55369ad2e610cfefbd94fb7c"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.204652 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7wh5h" event={"ID":"be3e14f5-1877-4618-87b6-b60623792988","Type":"ContainerStarted","Data":"dcdf16f7146e5de714895a63594b67aa69f5404e8022d4f41573b6b930d25156"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.207155 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b33962a9-1867-4e1c-b597-d426ecf83e50","Type":"ContainerStarted","Data":"167a4a89dbc74b580fe763133d79aa3cb304d675c78455afa511355624bb057d"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.236753 4697 generic.go:334] "Generic (PLEG): container finished" podID="afbfdf08-d446-4739-b771-0244cb6001d9" containerID="335864a5710161af316b9847516e428794469603a90c5a170802e7cd4bb18bfa" exitCode=0 Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.237074 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" event={"ID":"afbfdf08-d446-4739-b771-0244cb6001d9","Type":"ContainerDied","Data":"335864a5710161af316b9847516e428794469603a90c5a170802e7cd4bb18bfa"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.237100 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" event={"ID":"afbfdf08-d446-4739-b771-0244cb6001d9","Type":"ContainerStarted","Data":"9d0afd6a96239f57fca698856bac7c21be94a75247903d3458015219bd033e75"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.251222 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" 
event={"ID":"b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0","Type":"ContainerDied","Data":"66cf3903b0f26b4c9a9aeecff6891c5ea5553332b6175d45776c351c6e63ca25"} Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.251298 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b4cc7db87-7bvg8" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.292411 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68cf5d8cfc-jjkq6"] Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.300379 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68cf5d8cfc-jjkq6"] Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.367132 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b4cc7db87-7bvg8"] Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.377323 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b4cc7db87-7bvg8"] Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.386702 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.388267 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.391828 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-p4l6c" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.392083 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.392451 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.392871 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.396644 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.507849 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.507956 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.508182 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " 
pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.508289 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.508339 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-config\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.508387 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.508786 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.508807 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llrt8\" (UniqueName: \"kubernetes.io/projected/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-kube-api-access-llrt8\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 
16:49:36 crc kubenswrapper[4697]: E0220 16:49:36.517509 4697 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 20 16:49:36 crc kubenswrapper[4697]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b6840114-f1c7-41d8-a497-7401c93a5b5c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 20 16:49:36 crc kubenswrapper[4697]: > podSandboxID="045a99a37beac5b9d3c9ec8c9df3cac05f0003cb55c7fe60e5bd2785fe9eacb6" Feb 20 16:49:36 crc kubenswrapper[4697]: E0220 16:49:36.517732 4697 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 16:49:36 crc kubenswrapper[4697]: container &Container{Name:dnsmasq-dns,Image:38.102.83.38:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n684h65fh56h6fh87h85h57h76h5b7h94hffh649hfbh8ch5bch56fh5c5hbh86hf9h99h5dch95h66hd5h555h566h646h546h79h9dh55dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sr6gq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d69579867-w4bmk_openstack(b6840114-f1c7-41d8-a497-7401c93a5b5c): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b6840114-f1c7-41d8-a497-7401c93a5b5c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 20 16:49:36 crc kubenswrapper[4697]: > logger="UnhandledError" Feb 20 16:49:36 crc kubenswrapper[4697]: E0220 16:49:36.536517 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b6840114-f1c7-41d8-a497-7401c93a5b5c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5d69579867-w4bmk" podUID="b6840114-f1c7-41d8-a497-7401c93a5b5c" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.610219 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " 
pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.610278 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.610316 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.610338 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-config\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.610364 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.610419 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.610453 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-llrt8\" (UniqueName: \"kubernetes.io/projected/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-kube-api-access-llrt8\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.610471 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.611624 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-config\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.611871 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.612175 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.612667 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " 
pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.618072 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.618334 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.620360 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.636054 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llrt8\" (UniqueName: \"kubernetes.io/projected/fb5d6748-960a-41d3-a11e-6dd21c3dd46f-kube-api-access-llrt8\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.645370 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fb5d6748-960a-41d3-a11e-6dd21c3dd46f\") " pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.710988 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.899326 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0" path="/var/lib/kubelet/pods/b8fbd3ad-9df8-4a65-b37b-e4b63374c9f0/volumes" Feb 20 16:49:36 crc kubenswrapper[4697]: I0220 16:49:36.900588 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8193e90-bf7e-4b08-8313-267825fd94a1" path="/var/lib/kubelet/pods/d8193e90-bf7e-4b08-8313-267825fd94a1/volumes" Feb 20 16:49:37 crc kubenswrapper[4697]: I0220 16:49:37.265060 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" event={"ID":"afbfdf08-d446-4739-b771-0244cb6001d9","Type":"ContainerStarted","Data":"535212505c095546a59f4d325dbc916a41c1820cff5dabab7db4fcffc07c8b46"} Feb 20 16:49:37 crc kubenswrapper[4697]: I0220 16:49:37.265407 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:37 crc kubenswrapper[4697]: I0220 16:49:37.284144 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" podStartSLOduration=14.284126029 podStartE2EDuration="14.284126029s" podCreationTimestamp="2026-02-20 16:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:49:37.280618461 +0000 UTC m=+1085.060663859" watchObservedRunningTime="2026-02-20 16:49:37.284126029 +0000 UTC m=+1085.064171437" Feb 20 16:49:37 crc kubenswrapper[4697]: I0220 16:49:37.364116 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 16:49:38 crc kubenswrapper[4697]: I0220 16:49:38.274289 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"fb5d6748-960a-41d3-a11e-6dd21c3dd46f","Type":"ContainerStarted","Data":"95574daa7233a0a77d3561f504f0b14f93b044576756b2f127661c4343ef7390"} Feb 20 16:49:43 crc kubenswrapper[4697]: I0220 16:49:43.429601 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:49:43 crc kubenswrapper[4697]: I0220 16:49:43.487551 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d69579867-w4bmk"] Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.349228 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b179bb2c-61ea-4bca-860b-b419bb8d3341","Type":"ContainerStarted","Data":"5ad128c5b8f06ffb55522b02476396cad1cc9db9bf06c60ea2074b242671e8d3"} Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.350844 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5224cc9f-d610-4ea0-94da-11cdb019dcce","Type":"ContainerStarted","Data":"3f293a48d38b7b781163e876ccf67bbca965ff9663c9453e12702b566695a6d3"} Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.352560 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"19a9810e-52c4-4428-9fca-5bf65d100f50","Type":"ContainerStarted","Data":"e5f60638ae26d54c71b77faee8e33773c4b269fdfcb78827310a344e5f4e29b8"} Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.352643 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.355047 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"591f7e7d-78bf-43a5-afe2-119f93765311","Type":"ContainerStarted","Data":"e8d6ccf2b09ae17c606154f3a9a7311ffe59e59a1a51f1eaf9fdbb26063850a3"} Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.356463 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-7wh5h" event={"ID":"be3e14f5-1877-4618-87b6-b60623792988","Type":"ContainerStarted","Data":"ad3f1a9341ff167bcf7a8daf15d50e84639ddfe53ba5f4e9e30df126918130f8"} Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.358865 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fb5d6748-960a-41d3-a11e-6dd21c3dd46f","Type":"ContainerStarted","Data":"fb4305404fbacaadf905fe8fafc158c015b269d4fafb5a523ae6fc4b5b41b9b1"} Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.362134 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"40ca67b4-1eb6-40a6-ad33-1982ed83eb63","Type":"ContainerStarted","Data":"0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6"} Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.366731 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33594c24-be5d-42de-ba91-5584becb21e3","Type":"ContainerStarted","Data":"adac9584f6f120b9cd938150ee23c05eb3e39e09ec6e7d52619c277d4f3bbf46"} Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.369162 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9e054b02-2e5f-4205-a29b-e1365412c207","Type":"ContainerStarted","Data":"1460807ae0756fc1a012f435cf98d8e80d1d937277413601e95464f9c5149bcd"} Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.369604 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.380985 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-62kkp" event={"ID":"aa3e6a99-a9a4-4578-94c6-8a4b641405ec","Type":"ContainerStarted","Data":"929216881c5ef55a04e3b8c9120f9d2be2a548399d6e7aa4df7e081e4ba42670"} Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.381097 4697 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-controller-62kkp"
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.387179 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c08b5df3-217d-41d0-b021-d29a0b7e7dd2","Type":"ContainerStarted","Data":"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e"}
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.392365 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d69579867-w4bmk" event={"ID":"b6840114-f1c7-41d8-a497-7401c93a5b5c","Type":"ContainerStarted","Data":"a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb"}
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.392538 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d69579867-w4bmk" podUID="b6840114-f1c7-41d8-a497-7401c93a5b5c" containerName="dnsmasq-dns" containerID="cri-o://a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb" gracePeriod=10
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.392762 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d69579867-w4bmk"
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.395225 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b33962a9-1867-4e1c-b597-d426ecf83e50","Type":"ContainerStarted","Data":"87fb75c0477db7a4f4be6b9a7ec6f06296685ed93793645296f605d5cfd8174c"}
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.403945 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.708166289 podStartE2EDuration="21.403929477s" podCreationTimestamp="2026-02-20 16:49:27 +0000 UTC" firstStartedPulling="2026-02-20 16:49:35.921389232 +0000 UTC m=+1083.701434630" lastFinishedPulling="2026-02-20 16:49:42.61715241 +0000 UTC m=+1090.397197818" observedRunningTime="2026-02-20 16:49:48.402671466 +0000 UTC m=+1096.182716884" watchObservedRunningTime="2026-02-20 16:49:48.403929477 +0000 UTC m=+1096.183974885"
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.440042 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.424736092 podStartE2EDuration="19.440027753s" podCreationTimestamp="2026-02-20 16:49:29 +0000 UTC" firstStartedPulling="2026-02-20 16:49:35.927158767 +0000 UTC m=+1083.707204175" lastFinishedPulling="2026-02-20 16:49:43.942450428 +0000 UTC m=+1091.722495836" observedRunningTime="2026-02-20 16:49:48.43789187 +0000 UTC m=+1096.217937298" watchObservedRunningTime="2026-02-20 16:49:48.440027753 +0000 UTC m=+1096.220073161"
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.547013 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-62kkp" podStartSLOduration=9.230276117 podStartE2EDuration="16.546994778s" podCreationTimestamp="2026-02-20 16:49:32 +0000 UTC" firstStartedPulling="2026-02-20 16:49:35.773100381 +0000 UTC m=+1083.553145789" lastFinishedPulling="2026-02-20 16:49:43.089819042 +0000 UTC m=+1090.869864450" observedRunningTime="2026-02-20 16:49:48.540826663 +0000 UTC m=+1096.320872091" watchObservedRunningTime="2026-02-20 16:49:48.546994778 +0000 UTC m=+1096.327040186"
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.833819 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d69579867-w4bmk"
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.911570 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr6gq\" (UniqueName: \"kubernetes.io/projected/b6840114-f1c7-41d8-a497-7401c93a5b5c-kube-api-access-sr6gq\") pod \"b6840114-f1c7-41d8-a497-7401c93a5b5c\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") "
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.911659 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-config\") pod \"b6840114-f1c7-41d8-a497-7401c93a5b5c\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") "
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.911707 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-dns-svc\") pod \"b6840114-f1c7-41d8-a497-7401c93a5b5c\" (UID: \"b6840114-f1c7-41d8-a497-7401c93a5b5c\") "
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.919203 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6840114-f1c7-41d8-a497-7401c93a5b5c-kube-api-access-sr6gq" (OuterVolumeSpecName: "kube-api-access-sr6gq") pod "b6840114-f1c7-41d8-a497-7401c93a5b5c" (UID: "b6840114-f1c7-41d8-a497-7401c93a5b5c"). InnerVolumeSpecName "kube-api-access-sr6gq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.947968 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6840114-f1c7-41d8-a497-7401c93a5b5c" (UID: "b6840114-f1c7-41d8-a497-7401c93a5b5c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 16:49:48 crc kubenswrapper[4697]: I0220 16:49:48.960665 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-config" (OuterVolumeSpecName: "config") pod "b6840114-f1c7-41d8-a497-7401c93a5b5c" (UID: "b6840114-f1c7-41d8-a497-7401c93a5b5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.013195 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr6gq\" (UniqueName: \"kubernetes.io/projected/b6840114-f1c7-41d8-a497-7401c93a5b5c-kube-api-access-sr6gq\") on node \"crc\" DevicePath \"\""
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.013223 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-config\") on node \"crc\" DevicePath \"\""
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.013232 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6840114-f1c7-41d8-a497-7401c93a5b5c-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.404486 4697 generic.go:334] "Generic (PLEG): container finished" podID="b6840114-f1c7-41d8-a497-7401c93a5b5c" containerID="a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb" exitCode=0
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.404529 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d69579867-w4bmk" event={"ID":"b6840114-f1c7-41d8-a497-7401c93a5b5c","Type":"ContainerDied","Data":"a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb"}
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.404601 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d69579867-w4bmk"
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.404618 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d69579867-w4bmk" event={"ID":"b6840114-f1c7-41d8-a497-7401c93a5b5c","Type":"ContainerDied","Data":"045a99a37beac5b9d3c9ec8c9df3cac05f0003cb55c7fe60e5bd2785fe9eacb6"}
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.404657 4697 scope.go:117] "RemoveContainer" containerID="a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb"
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.407062 4697 generic.go:334] "Generic (PLEG): container finished" podID="be3e14f5-1877-4618-87b6-b60623792988" containerID="ad3f1a9341ff167bcf7a8daf15d50e84639ddfe53ba5f4e9e30df126918130f8" exitCode=0
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.408453 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7wh5h" event={"ID":"be3e14f5-1877-4618-87b6-b60623792988","Type":"ContainerDied","Data":"ad3f1a9341ff167bcf7a8daf15d50e84639ddfe53ba5f4e9e30df126918130f8"}
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.448383 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d69579867-w4bmk"]
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.458389 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d69579867-w4bmk"]
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.627799 4697 scope.go:117] "RemoveContainer" containerID="3e100501e8b4bba5f4909896fecb235daf8b943a55369ad2e610cfefbd94fb7c"
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.691006 4697 scope.go:117] "RemoveContainer" containerID="a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb"
Feb 20 16:49:49 crc kubenswrapper[4697]: E0220 16:49:49.691546 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb\": container with ID starting with a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb not found: ID does not exist" containerID="a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb"
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.691583 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb"} err="failed to get container status \"a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb\": rpc error: code = NotFound desc = could not find container \"a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb\": container with ID starting with a6e8082d287454d7f4cbe18c337d052e6fe007ed1777c52376b89657e463bbcb not found: ID does not exist"
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.691608 4697 scope.go:117] "RemoveContainer" containerID="3e100501e8b4bba5f4909896fecb235daf8b943a55369ad2e610cfefbd94fb7c"
Feb 20 16:49:49 crc kubenswrapper[4697]: E0220 16:49:49.691961 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e100501e8b4bba5f4909896fecb235daf8b943a55369ad2e610cfefbd94fb7c\": container with ID starting with 3e100501e8b4bba5f4909896fecb235daf8b943a55369ad2e610cfefbd94fb7c not found: ID does not exist" containerID="3e100501e8b4bba5f4909896fecb235daf8b943a55369ad2e610cfefbd94fb7c"
Feb 20 16:49:49 crc kubenswrapper[4697]: I0220 16:49:49.691992 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e100501e8b4bba5f4909896fecb235daf8b943a55369ad2e610cfefbd94fb7c"} err="failed to get container status \"3e100501e8b4bba5f4909896fecb235daf8b943a55369ad2e610cfefbd94fb7c\": rpc error: code = NotFound desc = could not find container \"3e100501e8b4bba5f4909896fecb235daf8b943a55369ad2e610cfefbd94fb7c\": container with ID starting with 3e100501e8b4bba5f4909896fecb235daf8b943a55369ad2e610cfefbd94fb7c not found: ID does not exist"
Feb 20 16:49:50 crc kubenswrapper[4697]: I0220 16:49:50.416470 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b179bb2c-61ea-4bca-860b-b419bb8d3341","Type":"ContainerStarted","Data":"3a42485bbe04bf8e160ed9ecba228fb2f09a3beea2ad3e0e5a33590a7139f82b"}
Feb 20 16:49:50 crc kubenswrapper[4697]: I0220 16:49:50.418995 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7wh5h" event={"ID":"be3e14f5-1877-4618-87b6-b60623792988","Type":"ContainerStarted","Data":"024bde30929a8c6df44ffb54220dd65f69b106e418cc68b912ce92e7e7f97630"}
Feb 20 16:49:50 crc kubenswrapper[4697]: I0220 16:49:50.419237 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7wh5h"
Feb 20 16:49:50 crc kubenswrapper[4697]: I0220 16:49:50.419249 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7wh5h"
Feb 20 16:49:50 crc kubenswrapper[4697]: I0220 16:49:50.419258 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7wh5h" event={"ID":"be3e14f5-1877-4618-87b6-b60623792988","Type":"ContainerStarted","Data":"56d0f4636d4d05ec684987279e9390f99d5662509b6c82741580b13541f6cc5e"}
Feb 20 16:49:50 crc kubenswrapper[4697]: I0220 16:49:50.421001 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fb5d6748-960a-41d3-a11e-6dd21c3dd46f","Type":"ContainerStarted","Data":"3a12cc67e0931d3793f5e248caa33ec2eb2894ad090c18a2e3f4708d91c03985"}
Feb 20 16:49:50 crc kubenswrapper[4697]: I0220 16:49:50.432989 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.727523172 podStartE2EDuration="18.432973756s" podCreationTimestamp="2026-02-20 16:49:32 +0000 UTC" firstStartedPulling="2026-02-20 16:49:36.027546826 +0000 UTC m=+1083.807592234" lastFinishedPulling="2026-02-20 16:49:49.73299741 +0000 UTC m=+1097.513042818" observedRunningTime="2026-02-20 16:49:50.430881734 +0000 UTC m=+1098.210927152" watchObservedRunningTime="2026-02-20 16:49:50.432973756 +0000 UTC m=+1098.213019164"
Feb 20 16:49:50 crc kubenswrapper[4697]: I0220 16:49:50.451557 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.173081367 podStartE2EDuration="15.451539392s" podCreationTimestamp="2026-02-20 16:49:35 +0000 UTC" firstStartedPulling="2026-02-20 16:49:37.448112154 +0000 UTC m=+1085.228157562" lastFinishedPulling="2026-02-20 16:49:49.726570179 +0000 UTC m=+1097.506615587" observedRunningTime="2026-02-20 16:49:50.445688035 +0000 UTC m=+1098.225733443" watchObservedRunningTime="2026-02-20 16:49:50.451539392 +0000 UTC m=+1098.231584810"
Feb 20 16:49:50 crc kubenswrapper[4697]: I0220 16:49:50.472909 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-7wh5h" podStartSLOduration=11.364268509 podStartE2EDuration="18.472889118s" podCreationTimestamp="2026-02-20 16:49:32 +0000 UTC" firstStartedPulling="2026-02-20 16:49:35.802749675 +0000 UTC m=+1083.582795083" lastFinishedPulling="2026-02-20 16:49:42.911370284 +0000 UTC m=+1090.691415692" observedRunningTime="2026-02-20 16:49:50.466326263 +0000 UTC m=+1098.246371681" watchObservedRunningTime="2026-02-20 16:49:50.472889118 +0000 UTC m=+1098.252934546"
Feb 20 16:49:50 crc kubenswrapper[4697]: I0220 16:49:50.889645 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6840114-f1c7-41d8-a497-7401c93a5b5c" path="/var/lib/kubelet/pods/b6840114-f1c7-41d8-a497-7401c93a5b5c/volumes"
Feb 20 16:49:51 crc kubenswrapper[4697]: I0220 16:49:51.711774 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 20 16:49:51 crc kubenswrapper[4697]: I0220 16:49:51.712028 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 20 16:49:51 crc kubenswrapper[4697]: I0220 16:49:51.749023 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 20 16:49:51 crc kubenswrapper[4697]: I0220 16:49:51.808344 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 20 16:49:51 crc kubenswrapper[4697]: I0220 16:49:51.849882 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.435557 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.471707 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.472630 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.538627 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.734770 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66bdd8b645-qjlbf"]
Feb 20 16:49:52 crc kubenswrapper[4697]: E0220 16:49:52.735085 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6840114-f1c7-41d8-a497-7401c93a5b5c" containerName="init"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.735097 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6840114-f1c7-41d8-a497-7401c93a5b5c" containerName="init"
Feb 20 16:49:52 crc kubenswrapper[4697]: E0220 16:49:52.735122 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6840114-f1c7-41d8-a497-7401c93a5b5c" containerName="dnsmasq-dns"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.735128 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6840114-f1c7-41d8-a497-7401c93a5b5c" containerName="dnsmasq-dns"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.735323 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6840114-f1c7-41d8-a497-7401c93a5b5c" containerName="dnsmasq-dns"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.736268 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.742497 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.766097 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66bdd8b645-qjlbf"]
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.778531 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-ovsdbserver-nb\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.778597 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-dns-svc\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.778697 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-config\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.778725 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq8t9\" (UniqueName: \"kubernetes.io/projected/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-kube-api-access-dq8t9\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.824044 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-wrhqd"]
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.825113 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.831105 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.838740 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wrhqd"]
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.880918 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/650a1475-c144-4a21-a156-9859cb1418d4-ovs-rundir\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.880975 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/650a1475-c144-4a21-a156-9859cb1418d4-config\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.881027 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-config\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.881050 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq8t9\" (UniqueName: \"kubernetes.io/projected/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-kube-api-access-dq8t9\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.881194 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tkmr\" (UniqueName: \"kubernetes.io/projected/650a1475-c144-4a21-a156-9859cb1418d4-kube-api-access-5tkmr\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.881255 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/650a1475-c144-4a21-a156-9859cb1418d4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.881316 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/650a1475-c144-4a21-a156-9859cb1418d4-ovn-rundir\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.881355 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-ovsdbserver-nb\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.881372 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650a1475-c144-4a21-a156-9859cb1418d4-combined-ca-bundle\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.881401 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-dns-svc\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.882171 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-config\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.882185 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-dns-svc\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.882280 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-ovsdbserver-nb\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.900791 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.902179 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.914684 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.914856 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.914888 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.915730 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-jhhpp"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.916312 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66bdd8b645-qjlbf"]
Feb 20 16:49:52 crc kubenswrapper[4697]: E0220 16:49:52.916885 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-dq8t9], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf" podUID="eb6527d7-52ee-4cda-be40-5df08f3f4cd9"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.939634 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq8t9\" (UniqueName: \"kubernetes.io/projected/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-kube-api-access-dq8t9\") pod \"dnsmasq-dns-66bdd8b645-qjlbf\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.969263 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.983925 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v2vj\" (UniqueName: \"kubernetes.io/projected/00989e93-9419-47e9-a3ba-7b5e65910be9-kube-api-access-9v2vj\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.983990 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/650a1475-c144-4a21-a156-9859cb1418d4-ovs-rundir\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.984064 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00989e93-9419-47e9-a3ba-7b5e65910be9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.984105 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00989e93-9419-47e9-a3ba-7b5e65910be9-scripts\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.984124 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00989e93-9419-47e9-a3ba-7b5e65910be9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.984170 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00989e93-9419-47e9-a3ba-7b5e65910be9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.984194 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/650a1475-c144-4a21-a156-9859cb1418d4-config\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.984386 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00989e93-9419-47e9-a3ba-7b5e65910be9-config\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.984425 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tkmr\" (UniqueName: \"kubernetes.io/projected/650a1475-c144-4a21-a156-9859cb1418d4-kube-api-access-5tkmr\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.984492 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/650a1475-c144-4a21-a156-9859cb1418d4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.984555 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/650a1475-c144-4a21-a156-9859cb1418d4-ovn-rundir\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.984580 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00989e93-9419-47e9-a3ba-7b5e65910be9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.984608 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650a1475-c144-4a21-a156-9859cb1418d4-combined-ca-bundle\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.985617 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/650a1475-c144-4a21-a156-9859cb1418d4-ovs-rundir\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.987770 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/650a1475-c144-4a21-a156-9859cb1418d4-config\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.990790 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/650a1475-c144-4a21-a156-9859cb1418d4-ovn-rundir\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:52 crc kubenswrapper[4697]: I0220 16:49:52.994037 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650a1475-c144-4a21-a156-9859cb1418d4-combined-ca-bundle\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.000479 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/650a1475-c144-4a21-a156-9859cb1418d4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.026556 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tkmr\" (UniqueName: \"kubernetes.io/projected/650a1475-c144-4a21-a156-9859cb1418d4-kube-api-access-5tkmr\") pod \"ovn-controller-metrics-wrhqd\" (UID: \"650a1475-c144-4a21-a156-9859cb1418d4\") " pod="openstack/ovn-controller-metrics-wrhqd"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.039266 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f569ddf-7qd7f"]
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.041010 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.043847 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.045903 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f569ddf-7qd7f"]
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.087677 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-dns-svc\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.088196 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-config\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.088817 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00989e93-9419-47e9-a3ba-7b5e65910be9-config\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.089477 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-sb\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.089954 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8m2v\" (UniqueName: \"kubernetes.io/projected/95de2443-326d-455a-90d3-b71e616ffdd4-kube-api-access-m8m2v\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.090248 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00989e93-9419-47e9-a3ba-7b5e65910be9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.090790 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-nb\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.091024 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2vj\" (UniqueName: \"kubernetes.io/projected/00989e93-9419-47e9-a3ba-7b5e65910be9-kube-api-access-9v2vj\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.091161 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00989e93-9419-47e9-a3ba-7b5e65910be9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.091360 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00989e93-9419-47e9-a3ba-7b5e65910be9-scripts\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.091500 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00989e93-9419-47e9-a3ba-7b5e65910be9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.092267 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00989e93-9419-47e9-a3ba-7b5e65910be9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.092196 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00989e93-9419-47e9-a3ba-7b5e65910be9-scripts\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.091326 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00989e93-9419-47e9-a3ba-7b5e65910be9-config\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0"
Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.092658 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00989e93-9419-47e9-a3ba-7b5e65910be9-ovn-rundir\") pod
\"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.093346 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00989e93-9419-47e9-a3ba-7b5e65910be9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.094562 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00989e93-9419-47e9-a3ba-7b5e65910be9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.094706 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00989e93-9419-47e9-a3ba-7b5e65910be9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.109490 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v2vj\" (UniqueName: \"kubernetes.io/projected/00989e93-9419-47e9-a3ba-7b5e65910be9-kube-api-access-9v2vj\") pod \"ovn-northd-0\" (UID: \"00989e93-9419-47e9-a3ba-7b5e65910be9\") " pod="openstack/ovn-northd-0" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.154749 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-wrhqd" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.194299 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-nb\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.194572 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-dns-svc\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.194738 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-config\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.195375 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-nb\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.195599 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-dns-svc\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 
16:49:53.195839 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-config\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.195952 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-sb\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.196447 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8m2v\" (UniqueName: \"kubernetes.io/projected/95de2443-326d-455a-90d3-b71e616ffdd4-kube-api-access-m8m2v\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.196774 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-sb\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.212861 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8m2v\" (UniqueName: \"kubernetes.io/projected/95de2443-326d-455a-90d3-b71e616ffdd4-kube-api-access-m8m2v\") pod \"dnsmasq-dns-55f569ddf-7qd7f\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") " pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.278494 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.390377 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.445032 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.457334 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.591947 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wrhqd"] Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.603799 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-config\") pod \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.603853 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-ovsdbserver-nb\") pod \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.603951 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-dns-svc\") pod \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.604060 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq8t9\" (UniqueName: 
\"kubernetes.io/projected/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-kube-api-access-dq8t9\") pod \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\" (UID: \"eb6527d7-52ee-4cda-be40-5df08f3f4cd9\") " Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.605054 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-config" (OuterVolumeSpecName: "config") pod "eb6527d7-52ee-4cda-be40-5df08f3f4cd9" (UID: "eb6527d7-52ee-4cda-be40-5df08f3f4cd9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.605279 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb6527d7-52ee-4cda-be40-5df08f3f4cd9" (UID: "eb6527d7-52ee-4cda-be40-5df08f3f4cd9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.605645 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb6527d7-52ee-4cda-be40-5df08f3f4cd9" (UID: "eb6527d7-52ee-4cda-be40-5df08f3f4cd9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.608883 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-kube-api-access-dq8t9" (OuterVolumeSpecName: "kube-api-access-dq8t9") pod "eb6527d7-52ee-4cda-be40-5df08f3f4cd9" (UID: "eb6527d7-52ee-4cda-be40-5df08f3f4cd9"). InnerVolumeSpecName "kube-api-access-dq8t9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.652459 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f569ddf-7qd7f"] Feb 20 16:49:53 crc kubenswrapper[4697]: W0220 16:49:53.654704 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95de2443_326d_455a_90d3_b71e616ffdd4.slice/crio-cfa3376b73b29ea1cd296a9beb3f6fe68c4a22d0b88ddcd8a6a4ea1102ff5770 WatchSource:0}: Error finding container cfa3376b73b29ea1cd296a9beb3f6fe68c4a22d0b88ddcd8a6a4ea1102ff5770: Status 404 returned error can't find the container with id cfa3376b73b29ea1cd296a9beb3f6fe68c4a22d0b88ddcd8a6a4ea1102ff5770 Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.706715 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.707191 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.707207 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq8t9\" (UniqueName: \"kubernetes.io/projected/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-kube-api-access-dq8t9\") on node \"crc\" DevicePath \"\"" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.707222 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb6527d7-52ee-4cda-be40-5df08f3f4cd9-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:49:53 crc kubenswrapper[4697]: I0220 16:49:53.761837 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 16:49:54 crc 
kubenswrapper[4697]: I0220 16:49:54.455099 4697 generic.go:334] "Generic (PLEG): container finished" podID="95de2443-326d-455a-90d3-b71e616ffdd4" containerID="4b29de8b8b5ceab35b7daf4810da526f1cd3fe248ba08a43240f10dec6a94092" exitCode=0 Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.455156 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" event={"ID":"95de2443-326d-455a-90d3-b71e616ffdd4","Type":"ContainerDied","Data":"4b29de8b8b5ceab35b7daf4810da526f1cd3fe248ba08a43240f10dec6a94092"} Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.455467 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" event={"ID":"95de2443-326d-455a-90d3-b71e616ffdd4","Type":"ContainerStarted","Data":"cfa3376b73b29ea1cd296a9beb3f6fe68c4a22d0b88ddcd8a6a4ea1102ff5770"} Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.457106 4697 generic.go:334] "Generic (PLEG): container finished" podID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerID="c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e" exitCode=0 Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.457199 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c08b5df3-217d-41d0-b021-d29a0b7e7dd2","Type":"ContainerDied","Data":"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e"} Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.459054 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wrhqd" event={"ID":"650a1475-c144-4a21-a156-9859cb1418d4","Type":"ContainerStarted","Data":"4f2367e83f4a2e2e231dc801d8b32b99882052e0b61234cf400eebe7dc81da4a"} Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.459110 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wrhqd" 
event={"ID":"650a1475-c144-4a21-a156-9859cb1418d4","Type":"ContainerStarted","Data":"e463ba4b00b6b34bd8e79ac0a80ce11843228afafd2a20acefb085a358401f46"} Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.461572 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00989e93-9419-47e9-a3ba-7b5e65910be9","Type":"ContainerStarted","Data":"46588c7af1412e1e9197b7af987b7b99e1b0b52b01be5518f95a4dd311bf5960"} Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.461779 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66bdd8b645-qjlbf" Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.531499 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-wrhqd" podStartSLOduration=2.531478436 podStartE2EDuration="2.531478436s" podCreationTimestamp="2026-02-20 16:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:49:54.52205523 +0000 UTC m=+1102.302100648" watchObservedRunningTime="2026-02-20 16:49:54.531478436 +0000 UTC m=+1102.311523854" Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.575134 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66bdd8b645-qjlbf"] Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.580365 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66bdd8b645-qjlbf"] Feb 20 16:49:54 crc kubenswrapper[4697]: I0220 16:49:54.886378 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6527d7-52ee-4cda-be40-5df08f3f4cd9" path="/var/lib/kubelet/pods/eb6527d7-52ee-4cda-be40-5df08f3f4cd9/volumes" Feb 20 16:49:55 crc kubenswrapper[4697]: I0220 16:49:55.468252 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" 
event={"ID":"95de2443-326d-455a-90d3-b71e616ffdd4","Type":"ContainerStarted","Data":"6e585dd9103597f7f3584a745fbab063cbe90f8dc6931303520064525f330282"} Feb 20 16:49:55 crc kubenswrapper[4697]: I0220 16:49:55.468993 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:55 crc kubenswrapper[4697]: I0220 16:49:55.470085 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00989e93-9419-47e9-a3ba-7b5e65910be9","Type":"ContainerStarted","Data":"96c956b3546ffd22f84ce81097205977965a3ab200764520cb239aae68271aa4"} Feb 20 16:49:55 crc kubenswrapper[4697]: I0220 16:49:55.470115 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00989e93-9419-47e9-a3ba-7b5e65910be9","Type":"ContainerStarted","Data":"6e1052a68a12b2964ae0cbdd2a0a7f71419e6f18014839642d0a9e5261b4a27c"} Feb 20 16:49:55 crc kubenswrapper[4697]: I0220 16:49:55.470408 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 20 16:49:55 crc kubenswrapper[4697]: I0220 16:49:55.489637 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" podStartSLOduration=3.489622151 podStartE2EDuration="3.489622151s" podCreationTimestamp="2026-02-20 16:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:49:55.486725058 +0000 UTC m=+1103.266770466" watchObservedRunningTime="2026-02-20 16:49:55.489622151 +0000 UTC m=+1103.269667559" Feb 20 16:49:55 crc kubenswrapper[4697]: I0220 16:49:55.516677 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.046935881 podStartE2EDuration="3.516652759s" podCreationTimestamp="2026-02-20 16:49:52 +0000 UTC" firstStartedPulling="2026-02-20 16:49:53.775024664 +0000 
UTC m=+1101.555070072" lastFinishedPulling="2026-02-20 16:49:54.244741542 +0000 UTC m=+1102.024786950" observedRunningTime="2026-02-20 16:49:55.50432828 +0000 UTC m=+1103.284373688" watchObservedRunningTime="2026-02-20 16:49:55.516652759 +0000 UTC m=+1103.296698177" Feb 20 16:49:57 crc kubenswrapper[4697]: I0220 16:49:57.486724 4697 generic.go:334] "Generic (PLEG): container finished" podID="33594c24-be5d-42de-ba91-5584becb21e3" containerID="adac9584f6f120b9cd938150ee23c05eb3e39e09ec6e7d52619c277d4f3bbf46" exitCode=0 Feb 20 16:49:57 crc kubenswrapper[4697]: I0220 16:49:57.486943 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33594c24-be5d-42de-ba91-5584becb21e3","Type":"ContainerDied","Data":"adac9584f6f120b9cd938150ee23c05eb3e39e09ec6e7d52619c277d4f3bbf46"} Feb 20 16:49:57 crc kubenswrapper[4697]: I0220 16:49:57.490309 4697 generic.go:334] "Generic (PLEG): container finished" podID="b33962a9-1867-4e1c-b597-d426ecf83e50" containerID="87fb75c0477db7a4f4be6b9a7ec6f06296685ed93793645296f605d5cfd8174c" exitCode=0 Feb 20 16:49:57 crc kubenswrapper[4697]: I0220 16:49:57.490363 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b33962a9-1867-4e1c-b597-d426ecf83e50","Type":"ContainerDied","Data":"87fb75c0477db7a4f4be6b9a7ec6f06296685ed93793645296f605d5cfd8174c"} Feb 20 16:49:58 crc kubenswrapper[4697]: I0220 16:49:58.499647 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33594c24-be5d-42de-ba91-5584becb21e3","Type":"ContainerStarted","Data":"f92cbf066958efe186c4060dccb15a94fb9b12f3a189009342fb77f8eac405ed"} Feb 20 16:49:58 crc kubenswrapper[4697]: I0220 16:49:58.502892 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"b33962a9-1867-4e1c-b597-d426ecf83e50","Type":"ContainerStarted","Data":"a2ef900084f6f81ae928bc73bd93a0adc2f5a5ea5ef512c35f476f4e2097f595"} Feb 20 16:49:58 crc kubenswrapper[4697]: I0220 16:49:58.522270 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.871674697 podStartE2EDuration="34.52224917s" podCreationTimestamp="2026-02-20 16:49:24 +0000 UTC" firstStartedPulling="2026-02-20 16:49:35.126591317 +0000 UTC m=+1082.906636725" lastFinishedPulling="2026-02-20 16:49:43.77716578 +0000 UTC m=+1091.557211198" observedRunningTime="2026-02-20 16:49:58.517335119 +0000 UTC m=+1106.297380527" watchObservedRunningTime="2026-02-20 16:49:58.52224917 +0000 UTC m=+1106.302294578" Feb 20 16:49:58 crc kubenswrapper[4697]: I0220 16:49:58.547526 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=25.957197844 podStartE2EDuration="33.54750845s" podCreationTimestamp="2026-02-20 16:49:25 +0000 UTC" firstStartedPulling="2026-02-20 16:49:35.623384324 +0000 UTC m=+1083.403429732" lastFinishedPulling="2026-02-20 16:49:43.21369493 +0000 UTC m=+1090.993740338" observedRunningTime="2026-02-20 16:49:58.535784922 +0000 UTC m=+1106.315830320" watchObservedRunningTime="2026-02-20 16:49:58.54750845 +0000 UTC m=+1106.327553858" Feb 20 16:49:59 crc kubenswrapper[4697]: I0220 16:49:59.815640 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 20 16:49:59 crc kubenswrapper[4697]: I0220 16:49:59.855132 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f569ddf-7qd7f"] Feb 20 16:49:59 crc kubenswrapper[4697]: I0220 16:49:59.855402 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" podUID="95de2443-326d-455a-90d3-b71e616ffdd4" containerName="dnsmasq-dns" 
containerID="cri-o://6e585dd9103597f7f3584a745fbab063cbe90f8dc6931303520064525f330282" gracePeriod=10 Feb 20 16:49:59 crc kubenswrapper[4697]: I0220 16:49:59.857652 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:49:59 crc kubenswrapper[4697]: I0220 16:49:59.919805 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8f6ffc97-zdlsd"] Feb 20 16:49:59 crc kubenswrapper[4697]: I0220 16:49:59.921065 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:49:59 crc kubenswrapper[4697]: I0220 16:49:59.962829 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8f6ffc97-zdlsd"] Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.014301 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-sb\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.014369 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5skr8\" (UniqueName: \"kubernetes.io/projected/51280d75-c146-43f2-967b-bb34e273f534-kube-api-access-5skr8\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.014398 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-nb\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " 
pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.014455 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-dns-svc\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.014629 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-config\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.116256 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5skr8\" (UniqueName: \"kubernetes.io/projected/51280d75-c146-43f2-967b-bb34e273f534-kube-api-access-5skr8\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.116308 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-nb\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.116362 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-dns-svc\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 
16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.116421 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-config\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd"
Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.116478 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-sb\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd"
Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.117351 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-nb\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd"
Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.117360 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-sb\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd"
Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.117552 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-dns-svc\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd"
Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.117652 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-config\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd"
Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.141598 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5skr8\" (UniqueName: \"kubernetes.io/projected/51280d75-c146-43f2-967b-bb34e273f534-kube-api-access-5skr8\") pod \"dnsmasq-dns-b8f6ffc97-zdlsd\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd"
Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.244089 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd"
Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.520529 4697 generic.go:334] "Generic (PLEG): container finished" podID="95de2443-326d-455a-90d3-b71e616ffdd4" containerID="6e585dd9103597f7f3584a745fbab063cbe90f8dc6931303520064525f330282" exitCode=0
Feb 20 16:50:00 crc kubenswrapper[4697]: I0220 16:50:00.520561 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" event={"ID":"95de2443-326d-455a-90d3-b71e616ffdd4","Type":"ContainerDied","Data":"6e585dd9103597f7f3584a745fbab063cbe90f8dc6931303520064525f330282"}
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.112639 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.140414 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.149199 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-bbs48"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.149307 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.150121 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.150392 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.162621 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.187127 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.187215 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.187280 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.188136 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7914ab01639074e62ef59025fd1a7ee59be0ddca912a97f841ccc347f5312e8e"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.188220 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://7914ab01639074e62ef59025fd1a7ee59be0ddca912a97f841ccc347f5312e8e" gracePeriod=600
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.260555 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-lock\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.260910 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdbs\" (UniqueName: \"kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-kube-api-access-vjdbs\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.261037 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.261497 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.263069 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-cache\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.263107 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.364737 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjdbs\" (UniqueName: \"kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-kube-api-access-vjdbs\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.364785 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.364861 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.364907 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-cache\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.364926 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.364954 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-lock\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.365476 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-lock\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: E0220 16:50:01.365692 4697 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 20 16:50:01 crc kubenswrapper[4697]: E0220 16:50:01.365719 4697 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 20 16:50:01 crc kubenswrapper[4697]: E0220 16:50:01.365773 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift podName:8a8a227a-2c59-4ecd-a4c3-69c9018f1c13 nodeName:}" failed. No retries permitted until 2026-02-20 16:50:01.865746077 +0000 UTC m=+1109.645791485 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift") pod "swift-storage-0" (UID: "8a8a227a-2c59-4ecd-a4c3-69c9018f1c13") : configmap "swift-ring-files" not found
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.365882 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-cache\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.366095 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.372094 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.385970 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjdbs\" (UniqueName: \"kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-kube-api-access-vjdbs\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.392312 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.528658 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="7914ab01639074e62ef59025fd1a7ee59be0ddca912a97f841ccc347f5312e8e" exitCode=0
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.528947 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"7914ab01639074e62ef59025fd1a7ee59be0ddca912a97f841ccc347f5312e8e"}
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.528978 4697 scope.go:117] "RemoveContainer" containerID="ac1ff61636e81a13d334b99986c31ee9bcf221f2d7263a9112ad988ea78c70f4"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.651189 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-kwhn7"]
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.652314 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.654739 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.654761 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.654946 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.672406 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-kwhn7"]
Feb 20 16:50:01 crc kubenswrapper[4697]: E0220 16:50:01.673932 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-c8mmb ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-c8mmb ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-kwhn7" podUID="f91a80f9-5d36-494c-b4e8-5d4e1397bbe5"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.684138 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-frqkk"]
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.685652 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.699567 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-frqkk"]
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.723777 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-kwhn7"]
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.770494 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-ring-data-devices\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.770535 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-dispersionconf\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.770686 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqc5q\" (UniqueName: \"kubernetes.io/projected/a267ce98-60eb-4c3c-8906-de42f6872680-kube-api-access-gqc5q\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.770722 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-swiftconf\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.770752 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-combined-ca-bundle\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.770830 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-dispersionconf\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.770850 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-scripts\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.770918 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-swiftconf\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.770989 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-etc-swift\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.771012 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a267ce98-60eb-4c3c-8906-de42f6872680-etc-swift\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.771067 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8mmb\" (UniqueName: \"kubernetes.io/projected/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-kube-api-access-c8mmb\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.771097 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-combined-ca-bundle\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.771148 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-ring-data-devices\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.771170 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-scripts\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.872353 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-swiftconf\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.872410 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-etc-swift\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.872440 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a267ce98-60eb-4c3c-8906-de42f6872680-etc-swift\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.872470 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8mmb\" (UniqueName: \"kubernetes.io/projected/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-kube-api-access-c8mmb\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.872491 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-combined-ca-bundle\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.872521 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-ring-data-devices\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.872570 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-scripts\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.872961 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-etc-swift\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.872969 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a267ce98-60eb-4c3c-8906-de42f6872680-etc-swift\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.873340 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-ring-data-devices\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.873366 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-scripts\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.873476 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-ring-data-devices\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.873497 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-dispersionconf\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.873541 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.873571 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqc5q\" (UniqueName: \"kubernetes.io/projected/a267ce98-60eb-4c3c-8906-de42f6872680-kube-api-access-gqc5q\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.873902 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-swiftconf\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.873928 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-combined-ca-bundle\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: E0220 16:50:01.873684 4697 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.873961 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-dispersionconf\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: E0220 16:50:01.873975 4697 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 20 16:50:01 crc kubenswrapper[4697]: E0220 16:50:01.874021 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift podName:8a8a227a-2c59-4ecd-a4c3-69c9018f1c13 nodeName:}" failed. No retries permitted until 2026-02-20 16:50:02.874008177 +0000 UTC m=+1110.654053585 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift") pod "swift-storage-0" (UID: "8a8a227a-2c59-4ecd-a4c3-69c9018f1c13") : configmap "swift-ring-files" not found
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.874016 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-ring-data-devices\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.873977 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-scripts\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.874515 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-scripts\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.877632 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-swiftconf\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.877931 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-dispersionconf\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.878183 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-dispersionconf\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.884750 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-combined-ca-bundle\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.885865 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-swiftconf\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.886144 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-combined-ca-bundle\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.901496 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8mmb\" (UniqueName: \"kubernetes.io/projected/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-kube-api-access-c8mmb\") pod \"swift-ring-rebalance-kwhn7\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " pod="openstack/swift-ring-rebalance-kwhn7"
Feb 20 16:50:01 crc kubenswrapper[4697]: I0220 16:50:01.904356 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqc5q\" (UniqueName: \"kubernetes.io/projected/a267ce98-60eb-4c3c-8906-de42f6872680-kube-api-access-gqc5q\") pod \"swift-ring-rebalance-frqkk\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.002201 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-frqkk"
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.392959 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f"
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.486366 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8m2v\" (UniqueName: \"kubernetes.io/projected/95de2443-326d-455a-90d3-b71e616ffdd4-kube-api-access-m8m2v\") pod \"95de2443-326d-455a-90d3-b71e616ffdd4\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") "
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.486520 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-nb\") pod \"95de2443-326d-455a-90d3-b71e616ffdd4\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") "
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.486546 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-sb\") pod \"95de2443-326d-455a-90d3-b71e616ffdd4\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") "
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.486580 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-config\") pod \"95de2443-326d-455a-90d3-b71e616ffdd4\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") "
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.486717 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-dns-svc\") pod \"95de2443-326d-455a-90d3-b71e616ffdd4\" (UID: \"95de2443-326d-455a-90d3-b71e616ffdd4\") "
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.495657 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95de2443-326d-455a-90d3-b71e616ffdd4-kube-api-access-m8m2v" (OuterVolumeSpecName: "kube-api-access-m8m2v") pod "95de2443-326d-455a-90d3-b71e616ffdd4" (UID: "95de2443-326d-455a-90d3-b71e616ffdd4"). InnerVolumeSpecName "kube-api-access-m8m2v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.535793 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "95de2443-326d-455a-90d3-b71e616ffdd4" (UID: "95de2443-326d-455a-90d3-b71e616ffdd4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.588124 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.588676 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8m2v\" (UniqueName: \"kubernetes.io/projected/95de2443-326d-455a-90d3-b71e616ffdd4-kube-api-access-m8m2v\") on node \"crc\" DevicePath \"\""
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.592022 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "95de2443-326d-455a-90d3-b71e616ffdd4" (UID: "95de2443-326d-455a-90d3-b71e616ffdd4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.595035 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "95de2443-326d-455a-90d3-b71e616ffdd4" (UID: "95de2443-326d-455a-90d3-b71e616ffdd4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.626031 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"18b63bf23bfaf1519d7cd30ced77d1ca85c9b60f2bab25b83e9d358614cbd28d"} Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.637296 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" event={"ID":"95de2443-326d-455a-90d3-b71e616ffdd4","Type":"ContainerDied","Data":"cfa3376b73b29ea1cd296a9beb3f6fe68c4a22d0b88ddcd8a6a4ea1102ff5770"} Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.637353 4697 scope.go:117] "RemoveContainer" containerID="6e585dd9103597f7f3584a745fbab063cbe90f8dc6931303520064525f330282" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.637651 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f569ddf-7qd7f" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.637891 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-config" (OuterVolumeSpecName: "config") pod "95de2443-326d-455a-90d3-b71e616ffdd4" (UID: "95de2443-326d-455a-90d3-b71e616ffdd4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.649606 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kwhn7" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.649758 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c08b5df3-217d-41d0-b021-d29a0b7e7dd2","Type":"ContainerStarted","Data":"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6"} Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.653178 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8f6ffc97-zdlsd"] Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.665015 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-frqkk"] Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.669820 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kwhn7" Feb 20 16:50:02 crc kubenswrapper[4697]: W0220 16:50:02.675586 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda267ce98_60eb_4c3c_8906_de42f6872680.slice/crio-9b8afe3bceef71d5bae0158a10e522386a99b243bec099bd94cab57a333be726 WatchSource:0}: Error finding container 9b8afe3bceef71d5bae0158a10e522386a99b243bec099bd94cab57a333be726: Status 404 returned error can't find the container with id 9b8afe3bceef71d5bae0158a10e522386a99b243bec099bd94cab57a333be726 Feb 20 16:50:02 crc kubenswrapper[4697]: W0220 16:50:02.677864 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51280d75_c146_43f2_967b_bb34e273f534.slice/crio-c55487e9d38def07e799de5bc7dec78d34fbf277348a5f958d0e89665db90f26 WatchSource:0}: Error finding container c55487e9d38def07e799de5bc7dec78d34fbf277348a5f958d0e89665db90f26: Status 404 returned error can't find the container with id c55487e9d38def07e799de5bc7dec78d34fbf277348a5f958d0e89665db90f26 Feb 20 16:50:02 crc 
kubenswrapper[4697]: I0220 16:50:02.687795 4697 scope.go:117] "RemoveContainer" containerID="4b29de8b8b5ceab35b7daf4810da526f1cd3fe248ba08a43240f10dec6a94092" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.690147 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.690242 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.690313 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95de2443-326d-455a-90d3-b71e616ffdd4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.790964 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-etc-swift\") pod \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.791285 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-dispersionconf\") pod \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.791308 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-ring-data-devices\") pod \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " Feb 20 
16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.791348 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-swiftconf\") pod \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.791410 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8mmb\" (UniqueName: \"kubernetes.io/projected/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-kube-api-access-c8mmb\") pod \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.791533 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-combined-ca-bundle\") pod \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.791556 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-scripts\") pod \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\" (UID: \"f91a80f9-5d36-494c-b4e8-5d4e1397bbe5\") " Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.791582 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5" (UID: "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.792795 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5" (UID: "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.792832 4697 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.793081 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-scripts" (OuterVolumeSpecName: "scripts") pod "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5" (UID: "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.878414 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5" (UID: "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.878476 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5" (UID: "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.879118 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-kube-api-access-c8mmb" (OuterVolumeSpecName: "kube-api-access-c8mmb") pod "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5" (UID: "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5"). InnerVolumeSpecName "kube-api-access-c8mmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.891450 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5" (UID: "f91a80f9-5d36-494c-b4e8-5d4e1397bbe5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.894412 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.894506 4697 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.894519 4697 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.894531 4697 reconciler_common.go:293] "Volume detached for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.894543 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8mmb\" (UniqueName: \"kubernetes.io/projected/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-kube-api-access-c8mmb\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.894557 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.894570 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:02 crc kubenswrapper[4697]: E0220 16:50:02.894702 4697 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 16:50:02 crc kubenswrapper[4697]: E0220 16:50:02.894718 4697 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 16:50:02 crc kubenswrapper[4697]: E0220 16:50:02.894767 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift podName:8a8a227a-2c59-4ecd-a4c3-69c9018f1c13 nodeName:}" failed. No retries permitted until 2026-02-20 16:50:04.894749853 +0000 UTC m=+1112.674795261 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift") pod "swift-storage-0" (UID: "8a8a227a-2c59-4ecd-a4c3-69c9018f1c13") : configmap "swift-ring-files" not found Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.977917 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f569ddf-7qd7f"] Feb 20 16:50:02 crc kubenswrapper[4697]: I0220 16:50:02.983443 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f569ddf-7qd7f"] Feb 20 16:50:03 crc kubenswrapper[4697]: I0220 16:50:03.659170 4697 generic.go:334] "Generic (PLEG): container finished" podID="51280d75-c146-43f2-967b-bb34e273f534" containerID="ce86da0a4e0ad24afde9d0476338cef5ba1dd97a2eec0575c25123847bc56810" exitCode=0 Feb 20 16:50:03 crc kubenswrapper[4697]: I0220 16:50:03.659487 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" event={"ID":"51280d75-c146-43f2-967b-bb34e273f534","Type":"ContainerDied","Data":"ce86da0a4e0ad24afde9d0476338cef5ba1dd97a2eec0575c25123847bc56810"} Feb 20 16:50:03 crc kubenswrapper[4697]: I0220 16:50:03.659516 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" event={"ID":"51280d75-c146-43f2-967b-bb34e273f534","Type":"ContainerStarted","Data":"c55487e9d38def07e799de5bc7dec78d34fbf277348a5f958d0e89665db90f26"} Feb 20 16:50:03 crc kubenswrapper[4697]: I0220 16:50:03.661899 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-frqkk" event={"ID":"a267ce98-60eb-4c3c-8906-de42f6872680","Type":"ContainerStarted","Data":"9b8afe3bceef71d5bae0158a10e522386a99b243bec099bd94cab57a333be726"} Feb 20 16:50:03 crc kubenswrapper[4697]: I0220 16:50:03.664246 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kwhn7" Feb 20 16:50:03 crc kubenswrapper[4697]: I0220 16:50:03.711453 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-kwhn7"] Feb 20 16:50:03 crc kubenswrapper[4697]: I0220 16:50:03.720759 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-kwhn7"] Feb 20 16:50:04 crc kubenswrapper[4697]: I0220 16:50:04.885770 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95de2443-326d-455a-90d3-b71e616ffdd4" path="/var/lib/kubelet/pods/95de2443-326d-455a-90d3-b71e616ffdd4/volumes" Feb 20 16:50:04 crc kubenswrapper[4697]: I0220 16:50:04.887138 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91a80f9-5d36-494c-b4e8-5d4e1397bbe5" path="/var/lib/kubelet/pods/f91a80f9-5d36-494c-b4e8-5d4e1397bbe5/volumes" Feb 20 16:50:04 crc kubenswrapper[4697]: I0220 16:50:04.928536 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0" Feb 20 16:50:04 crc kubenswrapper[4697]: E0220 16:50:04.928791 4697 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 16:50:04 crc kubenswrapper[4697]: E0220 16:50:04.928811 4697 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 16:50:04 crc kubenswrapper[4697]: E0220 16:50:04.928852 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift podName:8a8a227a-2c59-4ecd-a4c3-69c9018f1c13 nodeName:}" failed. 
No retries permitted until 2026-02-20 16:50:08.928838254 +0000 UTC m=+1116.708883662 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift") pod "swift-storage-0" (UID: "8a8a227a-2c59-4ecd-a4c3-69c9018f1c13") : configmap "swift-ring-files" not found Feb 20 16:50:05 crc kubenswrapper[4697]: I0220 16:50:05.692289 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c08b5df3-217d-41d0-b021-d29a0b7e7dd2","Type":"ContainerStarted","Data":"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba"} Feb 20 16:50:05 crc kubenswrapper[4697]: I0220 16:50:05.974659 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 20 16:50:05 crc kubenswrapper[4697]: I0220 16:50:05.974753 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 20 16:50:06 crc kubenswrapper[4697]: I0220 16:50:06.073419 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 20 16:50:06 crc kubenswrapper[4697]: I0220 16:50:06.702148 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" event={"ID":"51280d75-c146-43f2-967b-bb34e273f534","Type":"ContainerStarted","Data":"70a64ecf59ec1712cb0795c6b3a95abdb5efff8f175d2f6d4f12ce800db5be9c"} Feb 20 16:50:06 crc kubenswrapper[4697]: I0220 16:50:06.702370 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:50:06 crc kubenswrapper[4697]: I0220 16:50:06.708733 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-frqkk" event={"ID":"a267ce98-60eb-4c3c-8906-de42f6872680","Type":"ContainerStarted","Data":"602ff3c6feb76fc4b9db23ce75b9ed9432707cc7b4e4619e2457bdce4383f979"} Feb 20 
16:50:06 crc kubenswrapper[4697]: I0220 16:50:06.736601 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" podStartSLOduration=7.736577995 podStartE2EDuration="7.736577995s" podCreationTimestamp="2026-02-20 16:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:06.728588799 +0000 UTC m=+1114.508634217" watchObservedRunningTime="2026-02-20 16:50:06.736577995 +0000 UTC m=+1114.516623413" Feb 20 16:50:06 crc kubenswrapper[4697]: I0220 16:50:06.749234 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-frqkk" podStartSLOduration=2.878845349 podStartE2EDuration="5.749218495s" podCreationTimestamp="2026-02-20 16:50:01 +0000 UTC" firstStartedPulling="2026-02-20 16:50:02.688002316 +0000 UTC m=+1110.468047724" lastFinishedPulling="2026-02-20 16:50:05.558375462 +0000 UTC m=+1113.338420870" observedRunningTime="2026-02-20 16:50:06.744851168 +0000 UTC m=+1114.524896566" watchObservedRunningTime="2026-02-20 16:50:06.749218495 +0000 UTC m=+1114.529263903" Feb 20 16:50:06 crc kubenswrapper[4697]: I0220 16:50:06.836027 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 20 16:50:07 crc kubenswrapper[4697]: I0220 16:50:07.247398 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 20 16:50:07 crc kubenswrapper[4697]: I0220 16:50:07.247461 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 20 16:50:07 crc kubenswrapper[4697]: I0220 16:50:07.406460 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.024224 4697 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.742052 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0d09-account-create-update-d99d8"] Feb 20 16:50:08 crc kubenswrapper[4697]: E0220 16:50:08.742423 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95de2443-326d-455a-90d3-b71e616ffdd4" containerName="dnsmasq-dns" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.742446 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="95de2443-326d-455a-90d3-b71e616ffdd4" containerName="dnsmasq-dns" Feb 20 16:50:08 crc kubenswrapper[4697]: E0220 16:50:08.742459 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95de2443-326d-455a-90d3-b71e616ffdd4" containerName="init" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.742465 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="95de2443-326d-455a-90d3-b71e616ffdd4" containerName="init" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.742638 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="95de2443-326d-455a-90d3-b71e616ffdd4" containerName="dnsmasq-dns" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.743220 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0d09-account-create-update-d99d8" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.751874 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.755260 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0d09-account-create-update-d99d8"] Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.798248 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dzvcx"] Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.801440 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dzvcx" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.807089 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sznhw\" (UniqueName: \"kubernetes.io/projected/f66cbebc-f05e-4503-9e6e-8877ea8904fb-kube-api-access-sznhw\") pod \"keystone-0d09-account-create-update-d99d8\" (UID: \"f66cbebc-f05e-4503-9e6e-8877ea8904fb\") " pod="openstack/keystone-0d09-account-create-update-d99d8" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.807335 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f66cbebc-f05e-4503-9e6e-8877ea8904fb-operator-scripts\") pod \"keystone-0d09-account-create-update-d99d8\" (UID: \"f66cbebc-f05e-4503-9e6e-8877ea8904fb\") " pod="openstack/keystone-0d09-account-create-update-d99d8" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.845386 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dzvcx"] Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.912293 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sznhw\" (UniqueName: 
\"kubernetes.io/projected/f66cbebc-f05e-4503-9e6e-8877ea8904fb-kube-api-access-sznhw\") pod \"keystone-0d09-account-create-update-d99d8\" (UID: \"f66cbebc-f05e-4503-9e6e-8877ea8904fb\") " pod="openstack/keystone-0d09-account-create-update-d99d8" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.912385 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmxpv\" (UniqueName: \"kubernetes.io/projected/a72078d9-fd39-496f-b41d-ffb493c9bc14-kube-api-access-qmxpv\") pod \"keystone-db-create-dzvcx\" (UID: \"a72078d9-fd39-496f-b41d-ffb493c9bc14\") " pod="openstack/keystone-db-create-dzvcx" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.912648 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a72078d9-fd39-496f-b41d-ffb493c9bc14-operator-scripts\") pod \"keystone-db-create-dzvcx\" (UID: \"a72078d9-fd39-496f-b41d-ffb493c9bc14\") " pod="openstack/keystone-db-create-dzvcx" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.912787 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f66cbebc-f05e-4503-9e6e-8877ea8904fb-operator-scripts\") pod \"keystone-0d09-account-create-update-d99d8\" (UID: \"f66cbebc-f05e-4503-9e6e-8877ea8904fb\") " pod="openstack/keystone-0d09-account-create-update-d99d8" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.913592 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f66cbebc-f05e-4503-9e6e-8877ea8904fb-operator-scripts\") pod \"keystone-0d09-account-create-update-d99d8\" (UID: \"f66cbebc-f05e-4503-9e6e-8877ea8904fb\") " pod="openstack/keystone-0d09-account-create-update-d99d8" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.949049 4697 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/placement-db-create-f27ln"] Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.950810 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f27ln" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.953629 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sznhw\" (UniqueName: \"kubernetes.io/projected/f66cbebc-f05e-4503-9e6e-8877ea8904fb-kube-api-access-sznhw\") pod \"keystone-0d09-account-create-update-d99d8\" (UID: \"f66cbebc-f05e-4503-9e6e-8877ea8904fb\") " pod="openstack/keystone-0d09-account-create-update-d99d8" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.972742 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-f27ln"] Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.975228 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e609-account-create-update-r9q9h"] Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.976890 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e609-account-create-update-r9q9h" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.984110 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 20 16:50:08 crc kubenswrapper[4697]: I0220 16:50:08.998239 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e609-account-create-update-r9q9h"] Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.013842 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f93ead78-caf8-4bda-a0be-ef377041fb5a-operator-scripts\") pod \"placement-db-create-f27ln\" (UID: \"f93ead78-caf8-4bda-a0be-ef377041fb5a\") " pod="openstack/placement-db-create-f27ln" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.014046 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.014154 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmxpv\" (UniqueName: \"kubernetes.io/projected/a72078d9-fd39-496f-b41d-ffb493c9bc14-kube-api-access-qmxpv\") pod \"keystone-db-create-dzvcx\" (UID: \"a72078d9-fd39-496f-b41d-ffb493c9bc14\") " pod="openstack/keystone-db-create-dzvcx" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.014291 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nx9s\" (UniqueName: \"kubernetes.io/projected/f93ead78-caf8-4bda-a0be-ef377041fb5a-kube-api-access-2nx9s\") pod \"placement-db-create-f27ln\" (UID: \"f93ead78-caf8-4bda-a0be-ef377041fb5a\") " pod="openstack/placement-db-create-f27ln" Feb 
20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.014361 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a72078d9-fd39-496f-b41d-ffb493c9bc14-operator-scripts\") pod \"keystone-db-create-dzvcx\" (UID: \"a72078d9-fd39-496f-b41d-ffb493c9bc14\") " pod="openstack/keystone-db-create-dzvcx" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.015035 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a72078d9-fd39-496f-b41d-ffb493c9bc14-operator-scripts\") pod \"keystone-db-create-dzvcx\" (UID: \"a72078d9-fd39-496f-b41d-ffb493c9bc14\") " pod="openstack/keystone-db-create-dzvcx" Feb 20 16:50:09 crc kubenswrapper[4697]: E0220 16:50:09.015831 4697 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 16:50:09 crc kubenswrapper[4697]: E0220 16:50:09.015912 4697 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 16:50:09 crc kubenswrapper[4697]: E0220 16:50:09.015995 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift podName:8a8a227a-2c59-4ecd-a4c3-69c9018f1c13 nodeName:}" failed. No retries permitted until 2026-02-20 16:50:17.015982839 +0000 UTC m=+1124.796028247 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift") pod "swift-storage-0" (UID: "8a8a227a-2c59-4ecd-a4c3-69c9018f1c13") : configmap "swift-ring-files" not found Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.032035 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmxpv\" (UniqueName: \"kubernetes.io/projected/a72078d9-fd39-496f-b41d-ffb493c9bc14-kube-api-access-qmxpv\") pod \"keystone-db-create-dzvcx\" (UID: \"a72078d9-fd39-496f-b41d-ffb493c9bc14\") " pod="openstack/keystone-db-create-dzvcx" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.115859 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8dmc\" (UniqueName: \"kubernetes.io/projected/36c5e0bd-8650-46b9-a189-72bf5090b0f7-kube-api-access-g8dmc\") pod \"placement-e609-account-create-update-r9q9h\" (UID: \"36c5e0bd-8650-46b9-a189-72bf5090b0f7\") " pod="openstack/placement-e609-account-create-update-r9q9h" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.116009 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nx9s\" (UniqueName: \"kubernetes.io/projected/f93ead78-caf8-4bda-a0be-ef377041fb5a-kube-api-access-2nx9s\") pod \"placement-db-create-f27ln\" (UID: \"f93ead78-caf8-4bda-a0be-ef377041fb5a\") " pod="openstack/placement-db-create-f27ln" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.116070 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36c5e0bd-8650-46b9-a189-72bf5090b0f7-operator-scripts\") pod \"placement-e609-account-create-update-r9q9h\" (UID: \"36c5e0bd-8650-46b9-a189-72bf5090b0f7\") " pod="openstack/placement-e609-account-create-update-r9q9h" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.116105 4697 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f93ead78-caf8-4bda-a0be-ef377041fb5a-operator-scripts\") pod \"placement-db-create-f27ln\" (UID: \"f93ead78-caf8-4bda-a0be-ef377041fb5a\") " pod="openstack/placement-db-create-f27ln" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.117009 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f93ead78-caf8-4bda-a0be-ef377041fb5a-operator-scripts\") pod \"placement-db-create-f27ln\" (UID: \"f93ead78-caf8-4bda-a0be-ef377041fb5a\") " pod="openstack/placement-db-create-f27ln" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.131683 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dzvcx" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.139315 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nx9s\" (UniqueName: \"kubernetes.io/projected/f93ead78-caf8-4bda-a0be-ef377041fb5a-kube-api-access-2nx9s\") pod \"placement-db-create-f27ln\" (UID: \"f93ead78-caf8-4bda-a0be-ef377041fb5a\") " pod="openstack/placement-db-create-f27ln" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.217446 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8dmc\" (UniqueName: \"kubernetes.io/projected/36c5e0bd-8650-46b9-a189-72bf5090b0f7-kube-api-access-g8dmc\") pod \"placement-e609-account-create-update-r9q9h\" (UID: \"36c5e0bd-8650-46b9-a189-72bf5090b0f7\") " pod="openstack/placement-e609-account-create-update-r9q9h" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.217583 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36c5e0bd-8650-46b9-a189-72bf5090b0f7-operator-scripts\") pod 
\"placement-e609-account-create-update-r9q9h\" (UID: \"36c5e0bd-8650-46b9-a189-72bf5090b0f7\") " pod="openstack/placement-e609-account-create-update-r9q9h" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.218421 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36c5e0bd-8650-46b9-a189-72bf5090b0f7-operator-scripts\") pod \"placement-e609-account-create-update-r9q9h\" (UID: \"36c5e0bd-8650-46b9-a189-72bf5090b0f7\") " pod="openstack/placement-e609-account-create-update-r9q9h" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.234252 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8dmc\" (UniqueName: \"kubernetes.io/projected/36c5e0bd-8650-46b9-a189-72bf5090b0f7-kube-api-access-g8dmc\") pod \"placement-e609-account-create-update-r9q9h\" (UID: \"36c5e0bd-8650-46b9-a189-72bf5090b0f7\") " pod="openstack/placement-e609-account-create-update-r9q9h" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.551116 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f27ln" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.551270 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e609-account-create-update-r9q9h" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.554221 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0d09-account-create-update-d99d8" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.855566 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-fm2pl"] Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.856594 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-fm2pl" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.864737 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-fm2pl"] Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.927466 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s87fq\" (UniqueName: \"kubernetes.io/projected/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-kube-api-access-s87fq\") pod \"watcher-db-create-fm2pl\" (UID: \"4bdc7f7e-de85-48cb-b75d-969ff2d39d14\") " pod="openstack/watcher-db-create-fm2pl" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.927569 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-operator-scripts\") pod \"watcher-db-create-fm2pl\" (UID: \"4bdc7f7e-de85-48cb-b75d-969ff2d39d14\") " pod="openstack/watcher-db-create-fm2pl" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.943308 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-8382-account-create-update-fnxqt"] Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.945245 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-8382-account-create-update-fnxqt" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.947365 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Feb 20 16:50:09 crc kubenswrapper[4697]: I0220 16:50:09.951354 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-8382-account-create-update-fnxqt"] Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.029268 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/222d06f8-6f3e-45ce-9acd-18c6d624edf4-operator-scripts\") pod \"watcher-8382-account-create-update-fnxqt\" (UID: \"222d06f8-6f3e-45ce-9acd-18c6d624edf4\") " pod="openstack/watcher-8382-account-create-update-fnxqt" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.029388 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-operator-scripts\") pod \"watcher-db-create-fm2pl\" (UID: \"4bdc7f7e-de85-48cb-b75d-969ff2d39d14\") " pod="openstack/watcher-db-create-fm2pl" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.029524 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9vxl\" (UniqueName: \"kubernetes.io/projected/222d06f8-6f3e-45ce-9acd-18c6d624edf4-kube-api-access-j9vxl\") pod \"watcher-8382-account-create-update-fnxqt\" (UID: \"222d06f8-6f3e-45ce-9acd-18c6d624edf4\") " pod="openstack/watcher-8382-account-create-update-fnxqt" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.029768 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s87fq\" (UniqueName: \"kubernetes.io/projected/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-kube-api-access-s87fq\") pod \"watcher-db-create-fm2pl\" (UID: 
\"4bdc7f7e-de85-48cb-b75d-969ff2d39d14\") " pod="openstack/watcher-db-create-fm2pl" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.030171 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-operator-scripts\") pod \"watcher-db-create-fm2pl\" (UID: \"4bdc7f7e-de85-48cb-b75d-969ff2d39d14\") " pod="openstack/watcher-db-create-fm2pl" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.044929 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s87fq\" (UniqueName: \"kubernetes.io/projected/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-kube-api-access-s87fq\") pod \"watcher-db-create-fm2pl\" (UID: \"4bdc7f7e-de85-48cb-b75d-969ff2d39d14\") " pod="openstack/watcher-db-create-fm2pl" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.131512 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/222d06f8-6f3e-45ce-9acd-18c6d624edf4-operator-scripts\") pod \"watcher-8382-account-create-update-fnxqt\" (UID: \"222d06f8-6f3e-45ce-9acd-18c6d624edf4\") " pod="openstack/watcher-8382-account-create-update-fnxqt" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.131576 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9vxl\" (UniqueName: \"kubernetes.io/projected/222d06f8-6f3e-45ce-9acd-18c6d624edf4-kube-api-access-j9vxl\") pod \"watcher-8382-account-create-update-fnxqt\" (UID: \"222d06f8-6f3e-45ce-9acd-18c6d624edf4\") " pod="openstack/watcher-8382-account-create-update-fnxqt" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.132361 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/222d06f8-6f3e-45ce-9acd-18c6d624edf4-operator-scripts\") pod \"watcher-8382-account-create-update-fnxqt\" (UID: 
\"222d06f8-6f3e-45ce-9acd-18c6d624edf4\") " pod="openstack/watcher-8382-account-create-update-fnxqt" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.150253 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9vxl\" (UniqueName: \"kubernetes.io/projected/222d06f8-6f3e-45ce-9acd-18c6d624edf4-kube-api-access-j9vxl\") pod \"watcher-8382-account-create-update-fnxqt\" (UID: \"222d06f8-6f3e-45ce-9acd-18c6d624edf4\") " pod="openstack/watcher-8382-account-create-update-fnxqt" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.174042 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-fm2pl" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.245669 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.262273 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8382-account-create-update-fnxqt" Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.322678 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b44f8bbb5-5v6fh"] Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.322942 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" podUID="afbfdf08-d446-4739-b771-0244cb6001d9" containerName="dnsmasq-dns" containerID="cri-o://535212505c095546a59f4d325dbc916a41c1820cff5dabab7db4fcffc07c8b46" gracePeriod=10 Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.782747 4697 generic.go:334] "Generic (PLEG): container finished" podID="afbfdf08-d446-4739-b771-0244cb6001d9" containerID="535212505c095546a59f4d325dbc916a41c1820cff5dabab7db4fcffc07c8b46" exitCode=0 Feb 20 16:50:10 crc kubenswrapper[4697]: I0220 16:50:10.782797 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" event={"ID":"afbfdf08-d446-4739-b771-0244cb6001d9","Type":"ContainerDied","Data":"535212505c095546a59f4d325dbc916a41c1820cff5dabab7db4fcffc07c8b46"} Feb 20 16:50:11 crc kubenswrapper[4697]: I0220 16:50:11.963235 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.068764 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzhzq\" (UniqueName: \"kubernetes.io/projected/afbfdf08-d446-4739-b771-0244cb6001d9-kube-api-access-tzhzq\") pod \"afbfdf08-d446-4739-b771-0244cb6001d9\" (UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.068803 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-dns-svc\") pod \"afbfdf08-d446-4739-b771-0244cb6001d9\" (UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.068899 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-config\") pod \"afbfdf08-d446-4739-b771-0244cb6001d9\" (UID: \"afbfdf08-d446-4739-b771-0244cb6001d9\") " Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.075164 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afbfdf08-d446-4739-b771-0244cb6001d9-kube-api-access-tzhzq" (OuterVolumeSpecName: "kube-api-access-tzhzq") pod "afbfdf08-d446-4739-b771-0244cb6001d9" (UID: "afbfdf08-d446-4739-b771-0244cb6001d9"). InnerVolumeSpecName "kube-api-access-tzhzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.098250 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-fm2pl"] Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.110312 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e609-account-create-update-r9q9h"] Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.138245 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "afbfdf08-d446-4739-b771-0244cb6001d9" (UID: "afbfdf08-d446-4739-b771-0244cb6001d9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.147071 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0d09-account-create-update-d99d8"] Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.148944 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-config" (OuterVolumeSpecName: "config") pod "afbfdf08-d446-4739-b771-0244cb6001d9" (UID: "afbfdf08-d446-4739-b771-0244cb6001d9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:12 crc kubenswrapper[4697]: W0220 16:50:12.157290 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf66cbebc_f05e_4503_9e6e_8877ea8904fb.slice/crio-1391f140434a28f9344309645e00cb8859c411ef128896d7d214f8c2926f7050 WatchSource:0}: Error finding container 1391f140434a28f9344309645e00cb8859c411ef128896d7d214f8c2926f7050: Status 404 returned error can't find the container with id 1391f140434a28f9344309645e00cb8859c411ef128896d7d214f8c2926f7050 Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.162088 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-f27ln"] Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.172987 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzhzq\" (UniqueName: \"kubernetes.io/projected/afbfdf08-d446-4739-b771-0244cb6001d9-kube-api-access-tzhzq\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.173033 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.173042 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afbfdf08-d446-4739-b771-0244cb6001d9-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.178826 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dzvcx"] Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.296049 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-8382-account-create-update-fnxqt"] Feb 20 16:50:12 crc kubenswrapper[4697]: W0220 16:50:12.319789 4697 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod222d06f8_6f3e_45ce_9acd_18c6d624edf4.slice/crio-57968330576d530402034a52d78d69201af5a916fe92410bb92225ddf96d5be4 WatchSource:0}: Error finding container 57968330576d530402034a52d78d69201af5a916fe92410bb92225ddf96d5be4: Status 404 returned error can't find the container with id 57968330576d530402034a52d78d69201af5a916fe92410bb92225ddf96d5be4 Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.713919 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2lhlv"] Feb 20 16:50:12 crc kubenswrapper[4697]: E0220 16:50:12.716676 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afbfdf08-d446-4739-b771-0244cb6001d9" containerName="dnsmasq-dns" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.716713 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbfdf08-d446-4739-b771-0244cb6001d9" containerName="dnsmasq-dns" Feb 20 16:50:12 crc kubenswrapper[4697]: E0220 16:50:12.716728 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afbfdf08-d446-4739-b771-0244cb6001d9" containerName="init" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.716739 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbfdf08-d446-4739-b771-0244cb6001d9" containerName="init" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.716995 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="afbfdf08-d446-4739-b771-0244cb6001d9" containerName="dnsmasq-dns" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.717774 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2lhlv" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.725208 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2lhlv"] Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.780502 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkjfp\" (UniqueName: \"kubernetes.io/projected/dde77bd7-58d5-4213-9793-7f8549cb4ff5-kube-api-access-dkjfp\") pod \"glance-db-create-2lhlv\" (UID: \"dde77bd7-58d5-4213-9793-7f8549cb4ff5\") " pod="openstack/glance-db-create-2lhlv" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.780565 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dde77bd7-58d5-4213-9793-7f8549cb4ff5-operator-scripts\") pod \"glance-db-create-2lhlv\" (UID: \"dde77bd7-58d5-4213-9793-7f8549cb4ff5\") " pod="openstack/glance-db-create-2lhlv" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.796727 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8382-account-create-update-fnxqt" event={"ID":"222d06f8-6f3e-45ce-9acd-18c6d624edf4","Type":"ContainerStarted","Data":"92074cc543c50d5e7f01442d6308c34e68e974cd89665a1bc2ba3e1b5e68c9cf"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.796771 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8382-account-create-update-fnxqt" event={"ID":"222d06f8-6f3e-45ce-9acd-18c6d624edf4","Type":"ContainerStarted","Data":"57968330576d530402034a52d78d69201af5a916fe92410bb92225ddf96d5be4"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.798367 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dzvcx" event={"ID":"a72078d9-fd39-496f-b41d-ffb493c9bc14","Type":"ContainerStarted","Data":"08f209d6039c536d1f4a314b16d3302cdb150f865b94fe9edde4ddbcae026cea"} 
Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.798406 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dzvcx" event={"ID":"a72078d9-fd39-496f-b41d-ffb493c9bc14","Type":"ContainerStarted","Data":"b0301b426ac742cfc2dc7145e907043bbe1f8dc119866b0248f5d4d6ff0581cc"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.800677 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c08b5df3-217d-41d0-b021-d29a0b7e7dd2","Type":"ContainerStarted","Data":"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.802133 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0d09-account-create-update-d99d8" event={"ID":"f66cbebc-f05e-4503-9e6e-8877ea8904fb","Type":"ContainerStarted","Data":"babe786dd755d89365a5b351ff85e2fa16212923176a0da69ab437d303a8627e"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.802164 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0d09-account-create-update-d99d8" event={"ID":"f66cbebc-f05e-4503-9e6e-8877ea8904fb","Type":"ContainerStarted","Data":"1391f140434a28f9344309645e00cb8859c411ef128896d7d214f8c2926f7050"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.803845 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-fm2pl" event={"ID":"4bdc7f7e-de85-48cb-b75d-969ff2d39d14","Type":"ContainerStarted","Data":"c94a65cd6dc725cf5daf669470a052a9e002cfc0db79a402ec0bebdf8a7c5a0d"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.803868 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-fm2pl" event={"ID":"4bdc7f7e-de85-48cb-b75d-969ff2d39d14","Type":"ContainerStarted","Data":"eba097d8fa030c39363c8e0f253be5b16fcfca0c760dabe6ccb4e3d9a16007ea"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.806684 4697 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" event={"ID":"afbfdf08-d446-4739-b771-0244cb6001d9","Type":"ContainerDied","Data":"9d0afd6a96239f57fca698856bac7c21be94a75247903d3458015219bd033e75"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.806746 4697 scope.go:117] "RemoveContainer" containerID="535212505c095546a59f4d325dbc916a41c1820cff5dabab7db4fcffc07c8b46" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.806698 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b44f8bbb5-5v6fh" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.810894 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-8382-account-create-update-fnxqt" podStartSLOduration=3.810879638 podStartE2EDuration="3.810879638s" podCreationTimestamp="2026-02-20 16:50:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:12.810078788 +0000 UTC m=+1120.590124196" watchObservedRunningTime="2026-02-20 16:50:12.810879638 +0000 UTC m=+1120.590925046" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.817380 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f27ln" event={"ID":"f93ead78-caf8-4bda-a0be-ef377041fb5a","Type":"ContainerStarted","Data":"a9055d73a8156fed58aa4f45342897bca77336c25719dff76e5cde28821f946b"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.817421 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f27ln" event={"ID":"f93ead78-caf8-4bda-a0be-ef377041fb5a","Type":"ContainerStarted","Data":"5213eab09f9d0a85413915e5b874476d808801c3954cd2701fb25b672c145272"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.820687 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e609-account-create-update-r9q9h" 
event={"ID":"36c5e0bd-8650-46b9-a189-72bf5090b0f7","Type":"ContainerStarted","Data":"9291241e7f3e6e433f9d5756c9ccfb27273b0773b3c37ae187be09ed03573bea"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.820798 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e609-account-create-update-r9q9h" event={"ID":"36c5e0bd-8650-46b9-a189-72bf5090b0f7","Type":"ContainerStarted","Data":"cfca5b45cab64efd2cba1aa7543adb571fdeacb45496f94812eb2e3b3642f05b"} Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.835372 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a1b1-account-create-update-2w7nj"] Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.836592 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a1b1-account-create-update-2w7nj" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.842990 4697 scope.go:117] "RemoveContainer" containerID="335864a5710161af316b9847516e428794469603a90c5a170802e7cd4bb18bfa" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.844238 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=7.113182732 podStartE2EDuration="43.844218396s" podCreationTimestamp="2026-02-20 16:49:29 +0000 UTC" firstStartedPulling="2026-02-20 16:49:35.477607965 +0000 UTC m=+1083.257653373" lastFinishedPulling="2026-02-20 16:50:12.208643629 +0000 UTC m=+1119.988689037" observedRunningTime="2026-02-20 16:50:12.833654387 +0000 UTC m=+1120.613699805" watchObservedRunningTime="2026-02-20 16:50:12.844218396 +0000 UTC m=+1120.624263804" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.845853 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.882224 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkjfp\" (UniqueName: 
\"kubernetes.io/projected/dde77bd7-58d5-4213-9793-7f8549cb4ff5-kube-api-access-dkjfp\") pod \"glance-db-create-2lhlv\" (UID: \"dde77bd7-58d5-4213-9793-7f8549cb4ff5\") " pod="openstack/glance-db-create-2lhlv" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.882314 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d7ds\" (UniqueName: \"kubernetes.io/projected/b1be73ad-831a-4fab-9b34-646e75b34e01-kube-api-access-2d7ds\") pod \"glance-a1b1-account-create-update-2w7nj\" (UID: \"b1be73ad-831a-4fab-9b34-646e75b34e01\") " pod="openstack/glance-a1b1-account-create-update-2w7nj" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.882347 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dde77bd7-58d5-4213-9793-7f8549cb4ff5-operator-scripts\") pod \"glance-db-create-2lhlv\" (UID: \"dde77bd7-58d5-4213-9793-7f8549cb4ff5\") " pod="openstack/glance-db-create-2lhlv" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.882442 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1be73ad-831a-4fab-9b34-646e75b34e01-operator-scripts\") pod \"glance-a1b1-account-create-update-2w7nj\" (UID: \"b1be73ad-831a-4fab-9b34-646e75b34e01\") " pod="openstack/glance-a1b1-account-create-update-2w7nj" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.887207 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dde77bd7-58d5-4213-9793-7f8549cb4ff5-operator-scripts\") pod \"glance-db-create-2lhlv\" (UID: \"dde77bd7-58d5-4213-9793-7f8549cb4ff5\") " pod="openstack/glance-db-create-2lhlv" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.897597 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-db-create-dzvcx" podStartSLOduration=4.897581017 podStartE2EDuration="4.897581017s" podCreationTimestamp="2026-02-20 16:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:12.872161873 +0000 UTC m=+1120.652207281" watchObservedRunningTime="2026-02-20 16:50:12.897581017 +0000 UTC m=+1120.677626425" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.903290 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a1b1-account-create-update-2w7nj"] Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.911339 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkjfp\" (UniqueName: \"kubernetes.io/projected/dde77bd7-58d5-4213-9793-7f8549cb4ff5-kube-api-access-dkjfp\") pod \"glance-db-create-2lhlv\" (UID: \"dde77bd7-58d5-4213-9793-7f8549cb4ff5\") " pod="openstack/glance-db-create-2lhlv" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.911936 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-fm2pl" podStartSLOduration=3.911921709 podStartE2EDuration="3.911921709s" podCreationTimestamp="2026-02-20 16:50:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:12.891354724 +0000 UTC m=+1120.671400152" watchObservedRunningTime="2026-02-20 16:50:12.911921709 +0000 UTC m=+1120.691967117" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.912423 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-0d09-account-create-update-d99d8" podStartSLOduration=4.912416531 podStartE2EDuration="4.912416531s" podCreationTimestamp="2026-02-20 16:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-20 16:50:12.903752648 +0000 UTC m=+1120.683798056" watchObservedRunningTime="2026-02-20 16:50:12.912416531 +0000 UTC m=+1120.692461939" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.934291 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-e609-account-create-update-r9q9h" podStartSLOduration=4.9342515670000004 podStartE2EDuration="4.934251567s" podCreationTimestamp="2026-02-20 16:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:12.925604045 +0000 UTC m=+1120.705649453" watchObservedRunningTime="2026-02-20 16:50:12.934251567 +0000 UTC m=+1120.714296975" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.951360 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b44f8bbb5-5v6fh"] Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.984027 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1be73ad-831a-4fab-9b34-646e75b34e01-operator-scripts\") pod \"glance-a1b1-account-create-update-2w7nj\" (UID: \"b1be73ad-831a-4fab-9b34-646e75b34e01\") " pod="openstack/glance-a1b1-account-create-update-2w7nj" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.984403 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d7ds\" (UniqueName: \"kubernetes.io/projected/b1be73ad-831a-4fab-9b34-646e75b34e01-kube-api-access-2d7ds\") pod \"glance-a1b1-account-create-update-2w7nj\" (UID: \"b1be73ad-831a-4fab-9b34-646e75b34e01\") " pod="openstack/glance-a1b1-account-create-update-2w7nj" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.988517 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b44f8bbb5-5v6fh"] Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.988706 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1be73ad-831a-4fab-9b34-646e75b34e01-operator-scripts\") pod \"glance-a1b1-account-create-update-2w7nj\" (UID: \"b1be73ad-831a-4fab-9b34-646e75b34e01\") " pod="openstack/glance-a1b1-account-create-update-2w7nj" Feb 20 16:50:12 crc kubenswrapper[4697]: I0220 16:50:12.994737 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-f27ln" podStartSLOduration=4.994715712 podStartE2EDuration="4.994715712s" podCreationTimestamp="2026-02-20 16:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:12.962300366 +0000 UTC m=+1120.742345804" watchObservedRunningTime="2026-02-20 16:50:12.994715712 +0000 UTC m=+1120.774761120" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.002282 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d7ds\" (UniqueName: \"kubernetes.io/projected/b1be73ad-831a-4fab-9b34-646e75b34e01-kube-api-access-2d7ds\") pod \"glance-a1b1-account-create-update-2w7nj\" (UID: \"b1be73ad-831a-4fab-9b34-646e75b34e01\") " pod="openstack/glance-a1b1-account-create-update-2w7nj" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.039137 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2lhlv" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.076671 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7ljxq"] Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.078548 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.083680 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ljxq"] Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.153182 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a1b1-account-create-update-2w7nj" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.186918 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-catalog-content\") pod \"redhat-marketplace-7ljxq\" (UID: \"765973da-6ef4-44c6-823c-5670b03c5900\") " pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.186992 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-utilities\") pod \"redhat-marketplace-7ljxq\" (UID: \"765973da-6ef4-44c6-823c-5670b03c5900\") " pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.187061 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2c8j\" (UniqueName: \"kubernetes.io/projected/765973da-6ef4-44c6-823c-5670b03c5900-kube-api-access-n2c8j\") pod \"redhat-marketplace-7ljxq\" (UID: \"765973da-6ef4-44c6-823c-5670b03c5900\") " pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.291000 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-utilities\") pod \"redhat-marketplace-7ljxq\" (UID: 
\"765973da-6ef4-44c6-823c-5670b03c5900\") " pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.291088 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2c8j\" (UniqueName: \"kubernetes.io/projected/765973da-6ef4-44c6-823c-5670b03c5900-kube-api-access-n2c8j\") pod \"redhat-marketplace-7ljxq\" (UID: \"765973da-6ef4-44c6-823c-5670b03c5900\") " pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.291159 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-catalog-content\") pod \"redhat-marketplace-7ljxq\" (UID: \"765973da-6ef4-44c6-823c-5670b03c5900\") " pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.294519 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-catalog-content\") pod \"redhat-marketplace-7ljxq\" (UID: \"765973da-6ef4-44c6-823c-5670b03c5900\") " pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.294752 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-utilities\") pod \"redhat-marketplace-7ljxq\" (UID: \"765973da-6ef4-44c6-823c-5670b03c5900\") " pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.342304 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2c8j\" (UniqueName: \"kubernetes.io/projected/765973da-6ef4-44c6-823c-5670b03c5900-kube-api-access-n2c8j\") pod \"redhat-marketplace-7ljxq\" (UID: 
\"765973da-6ef4-44c6-823c-5670b03c5900\") " pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.397362 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.429710 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.585554 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2lhlv"] Feb 20 16:50:13 crc kubenswrapper[4697]: W0220 16:50:13.596318 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddde77bd7_58d5_4213_9793_7f8549cb4ff5.slice/crio-0e1b0bd974a251e1b999fe2c267239d2e3cb27013125a6cab929bbfe9dc67a6b WatchSource:0}: Error finding container 0e1b0bd974a251e1b999fe2c267239d2e3cb27013125a6cab929bbfe9dc67a6b: Status 404 returned error can't find the container with id 0e1b0bd974a251e1b999fe2c267239d2e3cb27013125a6cab929bbfe9dc67a6b Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.759230 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a1b1-account-create-update-2w7nj"] Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.836187 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2lhlv" event={"ID":"dde77bd7-58d5-4213-9793-7f8549cb4ff5","Type":"ContainerStarted","Data":"0e1b0bd974a251e1b999fe2c267239d2e3cb27013125a6cab929bbfe9dc67a6b"} Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.849225 4697 generic.go:334] "Generic (PLEG): container finished" podID="222d06f8-6f3e-45ce-9acd-18c6d624edf4" containerID="92074cc543c50d5e7f01442d6308c34e68e974cd89665a1bc2ba3e1b5e68c9cf" exitCode=0 Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.849300 4697 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/watcher-8382-account-create-update-fnxqt" event={"ID":"222d06f8-6f3e-45ce-9acd-18c6d624edf4","Type":"ContainerDied","Data":"92074cc543c50d5e7f01442d6308c34e68e974cd89665a1bc2ba3e1b5e68c9cf"} Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.854201 4697 generic.go:334] "Generic (PLEG): container finished" podID="4bdc7f7e-de85-48cb-b75d-969ff2d39d14" containerID="c94a65cd6dc725cf5daf669470a052a9e002cfc0db79a402ec0bebdf8a7c5a0d" exitCode=0 Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.854290 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-fm2pl" event={"ID":"4bdc7f7e-de85-48cb-b75d-969ff2d39d14","Type":"ContainerDied","Data":"c94a65cd6dc725cf5daf669470a052a9e002cfc0db79a402ec0bebdf8a7c5a0d"} Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.867691 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a1b1-account-create-update-2w7nj" event={"ID":"b1be73ad-831a-4fab-9b34-646e75b34e01","Type":"ContainerStarted","Data":"38a5a2f554a79e3990a3141799abd402c9765b6065d92b1eb271a1dd6f0190d9"} Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.875194 4697 generic.go:334] "Generic (PLEG): container finished" podID="f93ead78-caf8-4bda-a0be-ef377041fb5a" containerID="a9055d73a8156fed58aa4f45342897bca77336c25719dff76e5cde28821f946b" exitCode=0 Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.875254 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f27ln" event={"ID":"f93ead78-caf8-4bda-a0be-ef377041fb5a","Type":"ContainerDied","Data":"a9055d73a8156fed58aa4f45342897bca77336c25719dff76e5cde28821f946b"} Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.877767 4697 generic.go:334] "Generic (PLEG): container finished" podID="f66cbebc-f05e-4503-9e6e-8877ea8904fb" containerID="babe786dd755d89365a5b351ff85e2fa16212923176a0da69ab437d303a8627e" exitCode=0 Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.878043 4697 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0d09-account-create-update-d99d8" event={"ID":"f66cbebc-f05e-4503-9e6e-8877ea8904fb","Type":"ContainerDied","Data":"babe786dd755d89365a5b351ff85e2fa16212923176a0da69ab437d303a8627e"} Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.886339 4697 generic.go:334] "Generic (PLEG): container finished" podID="36c5e0bd-8650-46b9-a189-72bf5090b0f7" containerID="9291241e7f3e6e433f9d5756c9ccfb27273b0773b3c37ae187be09ed03573bea" exitCode=0 Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.886409 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e609-account-create-update-r9q9h" event={"ID":"36c5e0bd-8650-46b9-a189-72bf5090b0f7","Type":"ContainerDied","Data":"9291241e7f3e6e433f9d5756c9ccfb27273b0773b3c37ae187be09ed03573bea"} Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.889406 4697 generic.go:334] "Generic (PLEG): container finished" podID="a72078d9-fd39-496f-b41d-ffb493c9bc14" containerID="08f209d6039c536d1f4a314b16d3302cdb150f865b94fe9edde4ddbcae026cea" exitCode=0 Feb 20 16:50:13 crc kubenswrapper[4697]: I0220 16:50:13.890259 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dzvcx" event={"ID":"a72078d9-fd39-496f-b41d-ffb493c9bc14","Type":"ContainerDied","Data":"08f209d6039c536d1f4a314b16d3302cdb150f865b94fe9edde4ddbcae026cea"} Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.004049 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ljxq"] Feb 20 16:50:14 crc kubenswrapper[4697]: W0220 16:50:14.004566 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod765973da_6ef4_44c6_823c_5670b03c5900.slice/crio-534d03c3ada0b52b989e9f63079bba70eb48846c3ac3385b668cd0baeef99f27 WatchSource:0}: Error finding container 534d03c3ada0b52b989e9f63079bba70eb48846c3ac3385b668cd0baeef99f27: Status 
404 returned error can't find the container with id 534d03c3ada0b52b989e9f63079bba70eb48846c3ac3385b668cd0baeef99f27 Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.633374 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9gklb"] Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.634786 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9gklb" Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.637465 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.646005 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9gklb"] Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.717981 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlwz\" (UniqueName: \"kubernetes.io/projected/3ee5893d-3486-4ec4-9bab-76c57cd02c74-kube-api-access-qrlwz\") pod \"root-account-create-update-9gklb\" (UID: \"3ee5893d-3486-4ec4-9bab-76c57cd02c74\") " pod="openstack/root-account-create-update-9gklb" Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.718097 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ee5893d-3486-4ec4-9bab-76c57cd02c74-operator-scripts\") pod \"root-account-create-update-9gklb\" (UID: \"3ee5893d-3486-4ec4-9bab-76c57cd02c74\") " pod="openstack/root-account-create-update-9gklb" Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.819636 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ee5893d-3486-4ec4-9bab-76c57cd02c74-operator-scripts\") pod \"root-account-create-update-9gklb\" (UID: 
\"3ee5893d-3486-4ec4-9bab-76c57cd02c74\") " pod="openstack/root-account-create-update-9gklb" Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.819868 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlwz\" (UniqueName: \"kubernetes.io/projected/3ee5893d-3486-4ec4-9bab-76c57cd02c74-kube-api-access-qrlwz\") pod \"root-account-create-update-9gklb\" (UID: \"3ee5893d-3486-4ec4-9bab-76c57cd02c74\") " pod="openstack/root-account-create-update-9gklb" Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.820745 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ee5893d-3486-4ec4-9bab-76c57cd02c74-operator-scripts\") pod \"root-account-create-update-9gklb\" (UID: \"3ee5893d-3486-4ec4-9bab-76c57cd02c74\") " pod="openstack/root-account-create-update-9gklb" Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.843040 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlwz\" (UniqueName: \"kubernetes.io/projected/3ee5893d-3486-4ec4-9bab-76c57cd02c74-kube-api-access-qrlwz\") pod \"root-account-create-update-9gklb\" (UID: \"3ee5893d-3486-4ec4-9bab-76c57cd02c74\") " pod="openstack/root-account-create-update-9gklb" Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.902893 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afbfdf08-d446-4739-b771-0244cb6001d9" path="/var/lib/kubelet/pods/afbfdf08-d446-4739-b771-0244cb6001d9/volumes" Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.918348 4697 generic.go:334] "Generic (PLEG): container finished" podID="b1be73ad-831a-4fab-9b34-646e75b34e01" containerID="9de9d4761129d68c8bd03e71ca2487c9f0cf1fbe0609307192957117cddcdfe9" exitCode=0 Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.918406 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a1b1-account-create-update-2w7nj" 
event={"ID":"b1be73ad-831a-4fab-9b34-646e75b34e01","Type":"ContainerDied","Data":"9de9d4761129d68c8bd03e71ca2487c9f0cf1fbe0609307192957117cddcdfe9"} Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.920216 4697 generic.go:334] "Generic (PLEG): container finished" podID="765973da-6ef4-44c6-823c-5670b03c5900" containerID="d8f1c25e72ddbd0f90826a02ecde90644bf87732184fe1cab2d7fa2f912ec797" exitCode=0 Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.920298 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ljxq" event={"ID":"765973da-6ef4-44c6-823c-5670b03c5900","Type":"ContainerDied","Data":"d8f1c25e72ddbd0f90826a02ecde90644bf87732184fe1cab2d7fa2f912ec797"} Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.920333 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ljxq" event={"ID":"765973da-6ef4-44c6-823c-5670b03c5900","Type":"ContainerStarted","Data":"534d03c3ada0b52b989e9f63079bba70eb48846c3ac3385b668cd0baeef99f27"} Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.926627 4697 generic.go:334] "Generic (PLEG): container finished" podID="dde77bd7-58d5-4213-9793-7f8549cb4ff5" containerID="ea5ac79f7997f03e2b75422aadf6790db8fa562e94beabc4cac6e060d02aa0f3" exitCode=0 Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.926911 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2lhlv" event={"ID":"dde77bd7-58d5-4213-9793-7f8549cb4ff5","Type":"ContainerDied","Data":"ea5ac79f7997f03e2b75422aadf6790db8fa562e94beabc4cac6e060d02aa0f3"} Feb 20 16:50:14 crc kubenswrapper[4697]: I0220 16:50:14.954300 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9gklb" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.378373 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-f27ln" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.539023 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nx9s\" (UniqueName: \"kubernetes.io/projected/f93ead78-caf8-4bda-a0be-ef377041fb5a-kube-api-access-2nx9s\") pod \"f93ead78-caf8-4bda-a0be-ef377041fb5a\" (UID: \"f93ead78-caf8-4bda-a0be-ef377041fb5a\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.539114 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f93ead78-caf8-4bda-a0be-ef377041fb5a-operator-scripts\") pod \"f93ead78-caf8-4bda-a0be-ef377041fb5a\" (UID: \"f93ead78-caf8-4bda-a0be-ef377041fb5a\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.541112 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f93ead78-caf8-4bda-a0be-ef377041fb5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f93ead78-caf8-4bda-a0be-ef377041fb5a" (UID: "f93ead78-caf8-4bda-a0be-ef377041fb5a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.558964 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93ead78-caf8-4bda-a0be-ef377041fb5a-kube-api-access-2nx9s" (OuterVolumeSpecName: "kube-api-access-2nx9s") pod "f93ead78-caf8-4bda-a0be-ef377041fb5a" (UID: "f93ead78-caf8-4bda-a0be-ef377041fb5a"). InnerVolumeSpecName "kube-api-access-2nx9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.610563 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0d09-account-create-update-d99d8" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.620944 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8382-account-create-update-fnxqt" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.635792 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-fm2pl" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.639609 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dzvcx" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.640793 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nx9s\" (UniqueName: \"kubernetes.io/projected/f93ead78-caf8-4bda-a0be-ef377041fb5a-kube-api-access-2nx9s\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.640818 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f93ead78-caf8-4bda-a0be-ef377041fb5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.675337 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e609-account-create-update-r9q9h" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.741375 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s87fq\" (UniqueName: \"kubernetes.io/projected/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-kube-api-access-s87fq\") pod \"4bdc7f7e-de85-48cb-b75d-969ff2d39d14\" (UID: \"4bdc7f7e-de85-48cb-b75d-969ff2d39d14\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.741452 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a72078d9-fd39-496f-b41d-ffb493c9bc14-operator-scripts\") pod \"a72078d9-fd39-496f-b41d-ffb493c9bc14\" (UID: \"a72078d9-fd39-496f-b41d-ffb493c9bc14\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.741501 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/222d06f8-6f3e-45ce-9acd-18c6d624edf4-operator-scripts\") pod \"222d06f8-6f3e-45ce-9acd-18c6d624edf4\" (UID: \"222d06f8-6f3e-45ce-9acd-18c6d624edf4\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.741525 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9vxl\" (UniqueName: \"kubernetes.io/projected/222d06f8-6f3e-45ce-9acd-18c6d624edf4-kube-api-access-j9vxl\") pod \"222d06f8-6f3e-45ce-9acd-18c6d624edf4\" (UID: \"222d06f8-6f3e-45ce-9acd-18c6d624edf4\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.741551 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-operator-scripts\") pod \"4bdc7f7e-de85-48cb-b75d-969ff2d39d14\" (UID: \"4bdc7f7e-de85-48cb-b75d-969ff2d39d14\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.741641 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sznhw\" (UniqueName: \"kubernetes.io/projected/f66cbebc-f05e-4503-9e6e-8877ea8904fb-kube-api-access-sznhw\") pod \"f66cbebc-f05e-4503-9e6e-8877ea8904fb\" (UID: \"f66cbebc-f05e-4503-9e6e-8877ea8904fb\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.741685 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmxpv\" (UniqueName: \"kubernetes.io/projected/a72078d9-fd39-496f-b41d-ffb493c9bc14-kube-api-access-qmxpv\") pod \"a72078d9-fd39-496f-b41d-ffb493c9bc14\" (UID: \"a72078d9-fd39-496f-b41d-ffb493c9bc14\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.741745 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f66cbebc-f05e-4503-9e6e-8877ea8904fb-operator-scripts\") pod \"f66cbebc-f05e-4503-9e6e-8877ea8904fb\" (UID: \"f66cbebc-f05e-4503-9e6e-8877ea8904fb\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.743389 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f66cbebc-f05e-4503-9e6e-8877ea8904fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f66cbebc-f05e-4503-9e6e-8877ea8904fb" (UID: "f66cbebc-f05e-4503-9e6e-8877ea8904fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.744755 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a72078d9-fd39-496f-b41d-ffb493c9bc14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a72078d9-fd39-496f-b41d-ffb493c9bc14" (UID: "a72078d9-fd39-496f-b41d-ffb493c9bc14"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.744966 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4bdc7f7e-de85-48cb-b75d-969ff2d39d14" (UID: "4bdc7f7e-de85-48cb-b75d-969ff2d39d14"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.746012 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/222d06f8-6f3e-45ce-9acd-18c6d624edf4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "222d06f8-6f3e-45ce-9acd-18c6d624edf4" (UID: "222d06f8-6f3e-45ce-9acd-18c6d624edf4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.750250 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f66cbebc-f05e-4503-9e6e-8877ea8904fb-kube-api-access-sznhw" (OuterVolumeSpecName: "kube-api-access-sznhw") pod "f66cbebc-f05e-4503-9e6e-8877ea8904fb" (UID: "f66cbebc-f05e-4503-9e6e-8877ea8904fb"). InnerVolumeSpecName "kube-api-access-sznhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.750547 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72078d9-fd39-496f-b41d-ffb493c9bc14-kube-api-access-qmxpv" (OuterVolumeSpecName: "kube-api-access-qmxpv") pod "a72078d9-fd39-496f-b41d-ffb493c9bc14" (UID: "a72078d9-fd39-496f-b41d-ffb493c9bc14"). InnerVolumeSpecName "kube-api-access-qmxpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.750676 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/222d06f8-6f3e-45ce-9acd-18c6d624edf4-kube-api-access-j9vxl" (OuterVolumeSpecName: "kube-api-access-j9vxl") pod "222d06f8-6f3e-45ce-9acd-18c6d624edf4" (UID: "222d06f8-6f3e-45ce-9acd-18c6d624edf4"). InnerVolumeSpecName "kube-api-access-j9vxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.754886 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-kube-api-access-s87fq" (OuterVolumeSpecName: "kube-api-access-s87fq") pod "4bdc7f7e-de85-48cb-b75d-969ff2d39d14" (UID: "4bdc7f7e-de85-48cb-b75d-969ff2d39d14"). InnerVolumeSpecName "kube-api-access-s87fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.814224 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9gklb"] Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.844359 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36c5e0bd-8650-46b9-a189-72bf5090b0f7-operator-scripts\") pod \"36c5e0bd-8650-46b9-a189-72bf5090b0f7\" (UID: \"36c5e0bd-8650-46b9-a189-72bf5090b0f7\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.844490 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8dmc\" (UniqueName: \"kubernetes.io/projected/36c5e0bd-8650-46b9-a189-72bf5090b0f7-kube-api-access-g8dmc\") pod \"36c5e0bd-8650-46b9-a189-72bf5090b0f7\" (UID: \"36c5e0bd-8650-46b9-a189-72bf5090b0f7\") " Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.845174 4697 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qmxpv\" (UniqueName: \"kubernetes.io/projected/a72078d9-fd39-496f-b41d-ffb493c9bc14-kube-api-access-qmxpv\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.845196 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f66cbebc-f05e-4503-9e6e-8877ea8904fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.845206 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s87fq\" (UniqueName: \"kubernetes.io/projected/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-kube-api-access-s87fq\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.845215 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a72078d9-fd39-496f-b41d-ffb493c9bc14-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.845222 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/222d06f8-6f3e-45ce-9acd-18c6d624edf4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.845233 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9vxl\" (UniqueName: \"kubernetes.io/projected/222d06f8-6f3e-45ce-9acd-18c6d624edf4-kube-api-access-j9vxl\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.845241 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bdc7f7e-de85-48cb-b75d-969ff2d39d14-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.845249 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sznhw\" (UniqueName: 
\"kubernetes.io/projected/f66cbebc-f05e-4503-9e6e-8877ea8904fb-kube-api-access-sznhw\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.845974 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36c5e0bd-8650-46b9-a189-72bf5090b0f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36c5e0bd-8650-46b9-a189-72bf5090b0f7" (UID: "36c5e0bd-8650-46b9-a189-72bf5090b0f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.849231 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c5e0bd-8650-46b9-a189-72bf5090b0f7-kube-api-access-g8dmc" (OuterVolumeSpecName: "kube-api-access-g8dmc") pod "36c5e0bd-8650-46b9-a189-72bf5090b0f7" (UID: "36c5e0bd-8650-46b9-a189-72bf5090b0f7"). InnerVolumeSpecName "kube-api-access-g8dmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.944291 4697 generic.go:334] "Generic (PLEG): container finished" podID="765973da-6ef4-44c6-823c-5670b03c5900" containerID="aecf1c037a93e5924982cc68fbff4549bccb459c3817afef621cb031da7578e7" exitCode=0 Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.944341 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ljxq" event={"ID":"765973da-6ef4-44c6-823c-5670b03c5900","Type":"ContainerDied","Data":"aecf1c037a93e5924982cc68fbff4549bccb459c3817afef621cb031da7578e7"} Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.946858 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36c5e0bd-8650-46b9-a189-72bf5090b0f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.946877 4697 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-g8dmc\" (UniqueName: \"kubernetes.io/projected/36c5e0bd-8650-46b9-a189-72bf5090b0f7-kube-api-access-g8dmc\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.948750 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dzvcx" event={"ID":"a72078d9-fd39-496f-b41d-ffb493c9bc14","Type":"ContainerDied","Data":"b0301b426ac742cfc2dc7145e907043bbe1f8dc119866b0248f5d4d6ff0581cc"} Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.948773 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dzvcx" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.948782 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0301b426ac742cfc2dc7145e907043bbe1f8dc119866b0248f5d4d6ff0581cc" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.951607 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e609-account-create-update-r9q9h" event={"ID":"36c5e0bd-8650-46b9-a189-72bf5090b0f7","Type":"ContainerDied","Data":"cfca5b45cab64efd2cba1aa7543adb571fdeacb45496f94812eb2e3b3642f05b"} Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.951642 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfca5b45cab64efd2cba1aa7543adb571fdeacb45496f94812eb2e3b3642f05b" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.951647 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e609-account-create-update-r9q9h" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.955123 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8382-account-create-update-fnxqt" event={"ID":"222d06f8-6f3e-45ce-9acd-18c6d624edf4","Type":"ContainerDied","Data":"57968330576d530402034a52d78d69201af5a916fe92410bb92225ddf96d5be4"} Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.955164 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8382-account-create-update-fnxqt" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.955169 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57968330576d530402034a52d78d69201af5a916fe92410bb92225ddf96d5be4" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.956470 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0d09-account-create-update-d99d8" event={"ID":"f66cbebc-f05e-4503-9e6e-8877ea8904fb","Type":"ContainerDied","Data":"1391f140434a28f9344309645e00cb8859c411ef128896d7d214f8c2926f7050"} Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.956487 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1391f140434a28f9344309645e00cb8859c411ef128896d7d214f8c2926f7050" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.956499 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0d09-account-create-update-d99d8" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.957877 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-fm2pl" event={"ID":"4bdc7f7e-de85-48cb-b75d-969ff2d39d14","Type":"ContainerDied","Data":"eba097d8fa030c39363c8e0f253be5b16fcfca0c760dabe6ccb4e3d9a16007ea"} Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.957916 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eba097d8fa030c39363c8e0f253be5b16fcfca0c760dabe6ccb4e3d9a16007ea" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.957971 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-fm2pl" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.970027 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f27ln" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.970018 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f27ln" event={"ID":"f93ead78-caf8-4bda-a0be-ef377041fb5a","Type":"ContainerDied","Data":"5213eab09f9d0a85413915e5b874476d808801c3954cd2701fb25b672c145272"} Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.970196 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5213eab09f9d0a85413915e5b874476d808801c3954cd2701fb25b672c145272" Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.971426 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9gklb" event={"ID":"3ee5893d-3486-4ec4-9bab-76c57cd02c74","Type":"ContainerStarted","Data":"bb1a04af436f6d0132605862df365bde0f46ce6485cef4b15d7758a5be6140f9"} Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.974857 4697 generic.go:334] "Generic (PLEG): container finished" podID="a267ce98-60eb-4c3c-8906-de42f6872680" 
containerID="602ff3c6feb76fc4b9db23ce75b9ed9432707cc7b4e4619e2457bdce4383f979" exitCode=0 Feb 20 16:50:15 crc kubenswrapper[4697]: I0220 16:50:15.975015 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-frqkk" event={"ID":"a267ce98-60eb-4c3c-8906-de42f6872680","Type":"ContainerDied","Data":"602ff3c6feb76fc4b9db23ce75b9ed9432707cc7b4e4619e2457bdce4383f979"} Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.182459 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.182808 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.184942 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.377626 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a1b1-account-create-update-2w7nj" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.457255 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d7ds\" (UniqueName: \"kubernetes.io/projected/b1be73ad-831a-4fab-9b34-646e75b34e01-kube-api-access-2d7ds\") pod \"b1be73ad-831a-4fab-9b34-646e75b34e01\" (UID: \"b1be73ad-831a-4fab-9b34-646e75b34e01\") " Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.457305 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1be73ad-831a-4fab-9b34-646e75b34e01-operator-scripts\") pod \"b1be73ad-831a-4fab-9b34-646e75b34e01\" (UID: \"b1be73ad-831a-4fab-9b34-646e75b34e01\") " Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.457986 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1be73ad-831a-4fab-9b34-646e75b34e01-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1be73ad-831a-4fab-9b34-646e75b34e01" (UID: "b1be73ad-831a-4fab-9b34-646e75b34e01"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.460919 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1be73ad-831a-4fab-9b34-646e75b34e01-kube-api-access-2d7ds" (OuterVolumeSpecName: "kube-api-access-2d7ds") pod "b1be73ad-831a-4fab-9b34-646e75b34e01" (UID: "b1be73ad-831a-4fab-9b34-646e75b34e01"). InnerVolumeSpecName "kube-api-access-2d7ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.516157 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2lhlv" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.558874 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d7ds\" (UniqueName: \"kubernetes.io/projected/b1be73ad-831a-4fab-9b34-646e75b34e01-kube-api-access-2d7ds\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.559080 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1be73ad-831a-4fab-9b34-646e75b34e01-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.660219 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dde77bd7-58d5-4213-9793-7f8549cb4ff5-operator-scripts\") pod \"dde77bd7-58d5-4213-9793-7f8549cb4ff5\" (UID: \"dde77bd7-58d5-4213-9793-7f8549cb4ff5\") " Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.660658 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkjfp\" (UniqueName: \"kubernetes.io/projected/dde77bd7-58d5-4213-9793-7f8549cb4ff5-kube-api-access-dkjfp\") pod \"dde77bd7-58d5-4213-9793-7f8549cb4ff5\" (UID: \"dde77bd7-58d5-4213-9793-7f8549cb4ff5\") " Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.661062 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde77bd7-58d5-4213-9793-7f8549cb4ff5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dde77bd7-58d5-4213-9793-7f8549cb4ff5" (UID: "dde77bd7-58d5-4213-9793-7f8549cb4ff5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.661558 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dde77bd7-58d5-4213-9793-7f8549cb4ff5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.666782 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde77bd7-58d5-4213-9793-7f8549cb4ff5-kube-api-access-dkjfp" (OuterVolumeSpecName: "kube-api-access-dkjfp") pod "dde77bd7-58d5-4213-9793-7f8549cb4ff5" (UID: "dde77bd7-58d5-4213-9793-7f8549cb4ff5"). InnerVolumeSpecName "kube-api-access-dkjfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.763129 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkjfp\" (UniqueName: \"kubernetes.io/projected/dde77bd7-58d5-4213-9793-7f8549cb4ff5-kube-api-access-dkjfp\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.983583 4697 generic.go:334] "Generic (PLEG): container finished" podID="3ee5893d-3486-4ec4-9bab-76c57cd02c74" containerID="8e53ff167cd9abe968c8a185ef948cbf23e017beb3536675e043146d957b5ff9" exitCode=0 Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.983689 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9gklb" event={"ID":"3ee5893d-3486-4ec4-9bab-76c57cd02c74","Type":"ContainerDied","Data":"8e53ff167cd9abe968c8a185ef948cbf23e017beb3536675e043146d957b5ff9"} Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.986306 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2lhlv" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.986305 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2lhlv" event={"ID":"dde77bd7-58d5-4213-9793-7f8549cb4ff5","Type":"ContainerDied","Data":"0e1b0bd974a251e1b999fe2c267239d2e3cb27013125a6cab929bbfe9dc67a6b"} Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.986345 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e1b0bd974a251e1b999fe2c267239d2e3cb27013125a6cab929bbfe9dc67a6b" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.987807 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a1b1-account-create-update-2w7nj" event={"ID":"b1be73ad-831a-4fab-9b34-646e75b34e01","Type":"ContainerDied","Data":"38a5a2f554a79e3990a3141799abd402c9765b6065d92b1eb271a1dd6f0190d9"} Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.987842 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38a5a2f554a79e3990a3141799abd402c9765b6065d92b1eb271a1dd6f0190d9" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.987881 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a1b1-account-create-update-2w7nj" Feb 20 16:50:16 crc kubenswrapper[4697]: I0220 16:50:16.989175 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.069298 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.082152 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8a8a227a-2c59-4ecd-a4c3-69c9018f1c13-etc-swift\") pod \"swift-storage-0\" (UID: \"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13\") " pod="openstack/swift-storage-0" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.313741 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-frqkk" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.374797 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.412179 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-scripts\") pod \"a267ce98-60eb-4c3c-8906-de42f6872680\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.412230 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-dispersionconf\") pod \"a267ce98-60eb-4c3c-8906-de42f6872680\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.412257 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-swiftconf\") pod \"a267ce98-60eb-4c3c-8906-de42f6872680\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.412283 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-ring-data-devices\") pod \"a267ce98-60eb-4c3c-8906-de42f6872680\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.412306 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-combined-ca-bundle\") pod \"a267ce98-60eb-4c3c-8906-de42f6872680\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.412327 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/a267ce98-60eb-4c3c-8906-de42f6872680-etc-swift\") pod \"a267ce98-60eb-4c3c-8906-de42f6872680\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.412461 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqc5q\" (UniqueName: \"kubernetes.io/projected/a267ce98-60eb-4c3c-8906-de42f6872680-kube-api-access-gqc5q\") pod \"a267ce98-60eb-4c3c-8906-de42f6872680\" (UID: \"a267ce98-60eb-4c3c-8906-de42f6872680\") " Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.417394 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a267ce98-60eb-4c3c-8906-de42f6872680-kube-api-access-gqc5q" (OuterVolumeSpecName: "kube-api-access-gqc5q") pod "a267ce98-60eb-4c3c-8906-de42f6872680" (UID: "a267ce98-60eb-4c3c-8906-de42f6872680"). InnerVolumeSpecName "kube-api-access-gqc5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.417790 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a267ce98-60eb-4c3c-8906-de42f6872680" (UID: "a267ce98-60eb-4c3c-8906-de42f6872680"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.428977 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a267ce98-60eb-4c3c-8906-de42f6872680-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a267ce98-60eb-4c3c-8906-de42f6872680" (UID: "a267ce98-60eb-4c3c-8906-de42f6872680"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.442846 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a267ce98-60eb-4c3c-8906-de42f6872680" (UID: "a267ce98-60eb-4c3c-8906-de42f6872680"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.448906 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a267ce98-60eb-4c3c-8906-de42f6872680" (UID: "a267ce98-60eb-4c3c-8906-de42f6872680"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.449993 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-scripts" (OuterVolumeSpecName: "scripts") pod "a267ce98-60eb-4c3c-8906-de42f6872680" (UID: "a267ce98-60eb-4c3c-8906-de42f6872680"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.466290 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a267ce98-60eb-4c3c-8906-de42f6872680" (UID: "a267ce98-60eb-4c3c-8906-de42f6872680"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.521104 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqc5q\" (UniqueName: \"kubernetes.io/projected/a267ce98-60eb-4c3c-8906-de42f6872680-kube-api-access-gqc5q\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.521338 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.521348 4697 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.521357 4697 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.521365 4697 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a267ce98-60eb-4c3c-8906-de42f6872680-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.521373 4697 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a267ce98-60eb-4c3c-8906-de42f6872680-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.521381 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a267ce98-60eb-4c3c-8906-de42f6872680-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.922605 4697 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wtqdz"] Feb 20 16:50:17 crc kubenswrapper[4697]: E0220 16:50:17.923004 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a267ce98-60eb-4c3c-8906-de42f6872680" containerName="swift-ring-rebalance" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923024 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a267ce98-60eb-4c3c-8906-de42f6872680" containerName="swift-ring-rebalance" Feb 20 16:50:17 crc kubenswrapper[4697]: E0220 16:50:17.923041 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222d06f8-6f3e-45ce-9acd-18c6d624edf4" containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923049 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="222d06f8-6f3e-45ce-9acd-18c6d624edf4" containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: E0220 16:50:17.923063 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c5e0bd-8650-46b9-a189-72bf5090b0f7" containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923071 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c5e0bd-8650-46b9-a189-72bf5090b0f7" containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: E0220 16:50:17.923086 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93ead78-caf8-4bda-a0be-ef377041fb5a" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923094 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93ead78-caf8-4bda-a0be-ef377041fb5a" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: E0220 16:50:17.923109 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66cbebc-f05e-4503-9e6e-8877ea8904fb" containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: 
I0220 16:50:17.923117 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66cbebc-f05e-4503-9e6e-8877ea8904fb" containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: E0220 16:50:17.923131 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde77bd7-58d5-4213-9793-7f8549cb4ff5" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923139 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde77bd7-58d5-4213-9793-7f8549cb4ff5" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: E0220 16:50:17.923158 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72078d9-fd39-496f-b41d-ffb493c9bc14" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923165 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72078d9-fd39-496f-b41d-ffb493c9bc14" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: E0220 16:50:17.923183 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1be73ad-831a-4fab-9b34-646e75b34e01" containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923190 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1be73ad-831a-4fab-9b34-646e75b34e01" containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: E0220 16:50:17.923206 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdc7f7e-de85-48cb-b75d-969ff2d39d14" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923213 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdc7f7e-de85-48cb-b75d-969ff2d39d14" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923394 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c5e0bd-8650-46b9-a189-72bf5090b0f7" 
containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923417 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93ead78-caf8-4bda-a0be-ef377041fb5a" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923433 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f66cbebc-f05e-4503-9e6e-8877ea8904fb" containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923465 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde77bd7-58d5-4213-9793-7f8549cb4ff5" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923480 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72078d9-fd39-496f-b41d-ffb493c9bc14" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923490 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1be73ad-831a-4fab-9b34-646e75b34e01" containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923501 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bdc7f7e-de85-48cb-b75d-969ff2d39d14" containerName="mariadb-database-create" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923511 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="222d06f8-6f3e-45ce-9acd-18c6d624edf4" containerName="mariadb-account-create-update" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.923521 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="a267ce98-60eb-4c3c-8906-de42f6872680" containerName="swift-ring-rebalance" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.925240 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.929113 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qzcv5" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.929688 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 20 16:50:17 crc kubenswrapper[4697]: I0220 16:50:17.936329 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wtqdz"] Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.002318 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ljxq" event={"ID":"765973da-6ef4-44c6-823c-5670b03c5900","Type":"ContainerStarted","Data":"189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977"} Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.006577 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-frqkk" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.007540 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-frqkk" event={"ID":"a267ce98-60eb-4c3c-8906-de42f6872680","Type":"ContainerDied","Data":"9b8afe3bceef71d5bae0158a10e522386a99b243bec099bd94cab57a333be726"} Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.007562 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b8afe3bceef71d5bae0158a10e522386a99b243bec099bd94cab57a333be726" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.028699 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-db-sync-config-data\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 
16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.028747 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zw6l\" (UniqueName: \"kubernetes.io/projected/1c949e32-f57d-4f71-aaae-192d3ceea6de-kube-api-access-4zw6l\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.028801 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-combined-ca-bundle\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.028842 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-config-data\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.034571 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.044370 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7ljxq" podStartSLOduration=3.266429015 podStartE2EDuration="5.044350844s" podCreationTimestamp="2026-02-20 16:50:13 +0000 UTC" firstStartedPulling="2026-02-20 16:50:14.922149534 +0000 UTC m=+1122.702194952" lastFinishedPulling="2026-02-20 16:50:16.700071363 +0000 UTC m=+1124.480116781" observedRunningTime="2026-02-20 16:50:18.037061515 +0000 UTC m=+1125.817106923" watchObservedRunningTime="2026-02-20 16:50:18.044350844 +0000 UTC m=+1125.824396252" Feb 20 16:50:18 crc 
kubenswrapper[4697]: I0220 16:50:18.130540 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-db-sync-config-data\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.130622 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zw6l\" (UniqueName: \"kubernetes.io/projected/1c949e32-f57d-4f71-aaae-192d3ceea6de-kube-api-access-4zw6l\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.130735 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-combined-ca-bundle\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.130801 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-config-data\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.137113 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-combined-ca-bundle\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.137143 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-db-sync-config-data\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.144230 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-config-data\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.158111 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zw6l\" (UniqueName: \"kubernetes.io/projected/1c949e32-f57d-4f71-aaae-192d3ceea6de-kube-api-access-4zw6l\") pod \"glance-db-sync-wtqdz\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.191635 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-62kkp" podUID="aa3e6a99-a9a4-4578-94c6-8a4b641405ec" containerName="ovn-controller" probeResult="failure" output=< Feb 20 16:50:18 crc kubenswrapper[4697]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 20 16:50:18 crc kubenswrapper[4697]: > Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.262320 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.504120 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9gklb" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.638996 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrlwz\" (UniqueName: \"kubernetes.io/projected/3ee5893d-3486-4ec4-9bab-76c57cd02c74-kube-api-access-qrlwz\") pod \"3ee5893d-3486-4ec4-9bab-76c57cd02c74\" (UID: \"3ee5893d-3486-4ec4-9bab-76c57cd02c74\") " Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.639104 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ee5893d-3486-4ec4-9bab-76c57cd02c74-operator-scripts\") pod \"3ee5893d-3486-4ec4-9bab-76c57cd02c74\" (UID: \"3ee5893d-3486-4ec4-9bab-76c57cd02c74\") " Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.639928 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ee5893d-3486-4ec4-9bab-76c57cd02c74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ee5893d-3486-4ec4-9bab-76c57cd02c74" (UID: "3ee5893d-3486-4ec4-9bab-76c57cd02c74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.644037 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee5893d-3486-4ec4-9bab-76c57cd02c74-kube-api-access-qrlwz" (OuterVolumeSpecName: "kube-api-access-qrlwz") pod "3ee5893d-3486-4ec4-9bab-76c57cd02c74" (UID: "3ee5893d-3486-4ec4-9bab-76c57cd02c74"). InnerVolumeSpecName "kube-api-access-qrlwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.740612 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrlwz\" (UniqueName: \"kubernetes.io/projected/3ee5893d-3486-4ec4-9bab-76c57cd02c74-kube-api-access-qrlwz\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.740653 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ee5893d-3486-4ec4-9bab-76c57cd02c74-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:18 crc kubenswrapper[4697]: I0220 16:50:18.818280 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wtqdz"] Feb 20 16:50:19 crc kubenswrapper[4697]: I0220 16:50:19.015401 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"65330ad6a2add9d3aa81a38e81652759b1df655f9360c4ea5bc6ab9de3e6b652"} Feb 20 16:50:19 crc kubenswrapper[4697]: I0220 16:50:19.017258 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9gklb" event={"ID":"3ee5893d-3486-4ec4-9bab-76c57cd02c74","Type":"ContainerDied","Data":"bb1a04af436f6d0132605862df365bde0f46ce6485cef4b15d7758a5be6140f9"} Feb 20 16:50:19 crc kubenswrapper[4697]: I0220 16:50:19.017288 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb1a04af436f6d0132605862df365bde0f46ce6485cef4b15d7758a5be6140f9" Feb 20 16:50:19 crc kubenswrapper[4697]: I0220 16:50:19.017344 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9gklb" Feb 20 16:50:19 crc kubenswrapper[4697]: I0220 16:50:19.022690 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wtqdz" event={"ID":"1c949e32-f57d-4f71-aaae-192d3ceea6de","Type":"ContainerStarted","Data":"00a89dee06dfdbb6a84832e045b4fe0eae87684e8b92c22dba4da70bb4e25efa"} Feb 20 16:50:19 crc kubenswrapper[4697]: I0220 16:50:19.630893 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 16:50:20 crc kubenswrapper[4697]: I0220 16:50:20.040747 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"83f2b66b5ae472f64c6d8b483a31c879fca322e9e4508a9dc294106b436a8930"} Feb 20 16:50:20 crc kubenswrapper[4697]: I0220 16:50:20.040806 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"1478e4e310fe8c530cfaf2a0d7418d197f143bbf908ec6e3ae314966b9a6f451"} Feb 20 16:50:20 crc kubenswrapper[4697]: I0220 16:50:20.040820 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"f2f94ac3a42ce4bd99239dc0e20a287cf2e961ba8b7c79b76c4e8e59eb613054"} Feb 20 16:50:20 crc kubenswrapper[4697]: I0220 16:50:20.040956 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="prometheus" containerID="cri-o://0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6" gracePeriod=600 Feb 20 16:50:20 crc kubenswrapper[4697]: I0220 16:50:20.041007 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" 
podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="config-reloader" containerID="cri-o://547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba" gracePeriod=600 Feb 20 16:50:20 crc kubenswrapper[4697]: I0220 16:50:20.041007 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="thanos-sidecar" containerID="cri-o://c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123" gracePeriod=600 Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.025280 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.040477 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9gklb"] Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.044591 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9gklb"] Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.060700 4697 generic.go:334] "Generic (PLEG): container finished" podID="40ca67b4-1eb6-40a6-ad33-1982ed83eb63" containerID="0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6" exitCode=0 Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.060762 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"40ca67b4-1eb6-40a6-ad33-1982ed83eb63","Type":"ContainerDied","Data":"0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6"} Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.066055 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"718086e07d0b2539fa24fd1a1c69b873d5497913f63168e4449a9dff874d6b87"} Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.067796 
4697 generic.go:334] "Generic (PLEG): container finished" podID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerID="c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123" exitCode=0 Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.067815 4697 generic.go:334] "Generic (PLEG): container finished" podID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerID="547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba" exitCode=0 Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.067823 4697 generic.go:334] "Generic (PLEG): container finished" podID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerID="0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6" exitCode=0 Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.067850 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c08b5df3-217d-41d0-b021-d29a0b7e7dd2","Type":"ContainerDied","Data":"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123"} Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.067865 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c08b5df3-217d-41d0-b021-d29a0b7e7dd2","Type":"ContainerDied","Data":"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba"} Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.067874 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c08b5df3-217d-41d0-b021-d29a0b7e7dd2","Type":"ContainerDied","Data":"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6"} Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.067882 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c08b5df3-217d-41d0-b021-d29a0b7e7dd2","Type":"ContainerDied","Data":"6f6d21fccb187ea83597da718508115bf7743842b00388bdb3c4e037695256dd"} Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 
16:50:21.067898 4697 scope.go:117] "RemoveContainer" containerID="c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.068002 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.072046 4697 generic.go:334] "Generic (PLEG): container finished" podID="5224cc9f-d610-4ea0-94da-11cdb019dcce" containerID="3f293a48d38b7b781163e876ccf67bbca965ff9663c9453e12702b566695a6d3" exitCode=0 Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.072119 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5224cc9f-d610-4ea0-94da-11cdb019dcce","Type":"ContainerDied","Data":"3f293a48d38b7b781163e876ccf67bbca965ff9663c9453e12702b566695a6d3"} Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.076102 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config-out\") pod \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.076137 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-1\") pod \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.076189 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-0\") pod \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\" (UID: 
\"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.076220 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw2pz\" (UniqueName: \"kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-kube-api-access-lw2pz\") pod \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.076341 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.076383 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-2\") pod \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.076402 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-thanos-prometheus-http-client-file\") pod \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.076420 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config\") pod \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.076514 4697 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-tls-assets\") pod \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.076545 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-web-config\") pod \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\" (UID: \"c08b5df3-217d-41d0-b021-d29a0b7e7dd2\") " Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.078078 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "c08b5df3-217d-41d0-b021-d29a0b7e7dd2" (UID: "c08b5df3-217d-41d0-b021-d29a0b7e7dd2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.078390 4697 generic.go:334] "Generic (PLEG): container finished" podID="591f7e7d-78bf-43a5-afe2-119f93765311" containerID="e8d6ccf2b09ae17c606154f3a9a7311ffe59e59a1a51f1eaf9fdbb26063850a3" exitCode=0 Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.078429 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"591f7e7d-78bf-43a5-afe2-119f93765311","Type":"ContainerDied","Data":"e8d6ccf2b09ae17c606154f3a9a7311ffe59e59a1a51f1eaf9fdbb26063850a3"} Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.082593 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "c08b5df3-217d-41d0-b021-d29a0b7e7dd2" (UID: "c08b5df3-217d-41d0-b021-d29a0b7e7dd2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.082614 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config" (OuterVolumeSpecName: "config") pod "c08b5df3-217d-41d0-b021-d29a0b7e7dd2" (UID: "c08b5df3-217d-41d0-b021-d29a0b7e7dd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.084595 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-kube-api-access-lw2pz" (OuterVolumeSpecName: "kube-api-access-lw2pz") pod "c08b5df3-217d-41d0-b021-d29a0b7e7dd2" (UID: "c08b5df3-217d-41d0-b021-d29a0b7e7dd2"). InnerVolumeSpecName "kube-api-access-lw2pz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.084759 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config-out" (OuterVolumeSpecName: "config-out") pod "c08b5df3-217d-41d0-b021-d29a0b7e7dd2" (UID: "c08b5df3-217d-41d0-b021-d29a0b7e7dd2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.085656 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c08b5df3-217d-41d0-b021-d29a0b7e7dd2" (UID: "c08b5df3-217d-41d0-b021-d29a0b7e7dd2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.086388 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "c08b5df3-217d-41d0-b021-d29a0b7e7dd2" (UID: "c08b5df3-217d-41d0-b021-d29a0b7e7dd2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.094121 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c08b5df3-217d-41d0-b021-d29a0b7e7dd2" (UID: "c08b5df3-217d-41d0-b021-d29a0b7e7dd2"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.103728 4697 scope.go:117] "RemoveContainer" containerID="547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.118984 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "c08b5df3-217d-41d0-b021-d29a0b7e7dd2" (UID: "c08b5df3-217d-41d0-b021-d29a0b7e7dd2"). InnerVolumeSpecName "pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.121402 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-web-config" (OuterVolumeSpecName: "web-config") pod "c08b5df3-217d-41d0-b021-d29a0b7e7dd2" (UID: "c08b5df3-217d-41d0-b021-d29a0b7e7dd2"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.125911 4697 scope.go:117] "RemoveContainer" containerID="0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.179647 4697 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.179688 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw2pz\" (UniqueName: \"kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-kube-api-access-lw2pz\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.179719 4697 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") on node \"crc\" " Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.179730 4697 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.179762 4697 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.179772 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config\") on node \"crc\" 
DevicePath \"\"" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.179781 4697 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.179791 4697 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-web-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.179802 4697 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-config-out\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.179811 4697 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c08b5df3-217d-41d0-b021-d29a0b7e7dd2-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.205723 4697 scope.go:117] "RemoveContainer" containerID="c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.238472 4697 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.238917 4697 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0") on node "crc" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.275193 4697 scope.go:117] "RemoveContainer" containerID="c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123" Feb 20 16:50:21 crc kubenswrapper[4697]: E0220 16:50:21.275832 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123\": container with ID starting with c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123 not found: ID does not exist" containerID="c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.275878 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123"} err="failed to get container status \"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123\": rpc error: code = NotFound desc = could not find container \"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123\": container with ID starting with c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123 not found: ID does not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.275897 4697 scope.go:117] "RemoveContainer" containerID="547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba" Feb 20 16:50:21 crc kubenswrapper[4697]: E0220 16:50:21.276184 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba\": container with ID starting 
with 547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba not found: ID does not exist" containerID="547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.276241 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba"} err="failed to get container status \"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba\": rpc error: code = NotFound desc = could not find container \"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba\": container with ID starting with 547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba not found: ID does not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.276277 4697 scope.go:117] "RemoveContainer" containerID="0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6" Feb 20 16:50:21 crc kubenswrapper[4697]: E0220 16:50:21.277877 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6\": container with ID starting with 0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6 not found: ID does not exist" containerID="0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.277917 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6"} err="failed to get container status \"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6\": rpc error: code = NotFound desc = could not find container \"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6\": container with ID starting with 0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6 not found: ID does 
not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.277943 4697 scope.go:117] "RemoveContainer" containerID="c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e" Feb 20 16:50:21 crc kubenswrapper[4697]: E0220 16:50:21.278241 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e\": container with ID starting with c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e not found: ID does not exist" containerID="c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.278291 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e"} err="failed to get container status \"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e\": rpc error: code = NotFound desc = could not find container \"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e\": container with ID starting with c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e not found: ID does not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.278324 4697 scope.go:117] "RemoveContainer" containerID="c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.279084 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123"} err="failed to get container status \"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123\": rpc error: code = NotFound desc = could not find container \"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123\": container with ID starting with c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123 not 
found: ID does not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.279104 4697 scope.go:117] "RemoveContainer" containerID="547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.280537 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba"} err="failed to get container status \"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba\": rpc error: code = NotFound desc = could not find container \"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba\": container with ID starting with 547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba not found: ID does not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.280567 4697 scope.go:117] "RemoveContainer" containerID="0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.280827 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6"} err="failed to get container status \"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6\": rpc error: code = NotFound desc = could not find container \"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6\": container with ID starting with 0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6 not found: ID does not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.280857 4697 scope.go:117] "RemoveContainer" containerID="c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.281117 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e"} err="failed to get 
container status \"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e\": rpc error: code = NotFound desc = could not find container \"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e\": container with ID starting with c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e not found: ID does not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.281131 4697 scope.go:117] "RemoveContainer" containerID="c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.281421 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123"} err="failed to get container status \"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123\": rpc error: code = NotFound desc = could not find container \"c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123\": container with ID starting with c28154d975f591dc45ba44a483f26c84776a2dd6622337ccb2375e14cb2d7123 not found: ID does not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.281455 4697 scope.go:117] "RemoveContainer" containerID="547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.281486 4697 reconciler_common.go:293] "Volume detached for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.281660 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba"} err="failed to get container status \"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba\": rpc error: code = NotFound desc = could not find container 
\"547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba\": container with ID starting with 547c32697112cd6f823fb729cab4bb879d29eaedb92cf384790c38a8a5d106ba not found: ID does not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.281692 4697 scope.go:117] "RemoveContainer" containerID="0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.281890 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6"} err="failed to get container status \"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6\": rpc error: code = NotFound desc = could not find container \"0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6\": container with ID starting with 0a6176d01e7160c33420b8387d1c523bb748fca6e8b0c0513deae9125c3300f6 not found: ID does not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.281917 4697 scope.go:117] "RemoveContainer" containerID="c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.282094 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e"} err="failed to get container status \"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e\": rpc error: code = NotFound desc = could not find container \"c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e\": container with ID starting with c6e4baee3e5ce45bda26ed421bd7dd9af8c7f454e0eea139e0ff793facbf5f7e not found: ID does not exist" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.444292 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.455691 4697 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.475431 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 16:50:21 crc kubenswrapper[4697]: E0220 16:50:21.476835 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="config-reloader" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.476858 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="config-reloader" Feb 20 16:50:21 crc kubenswrapper[4697]: E0220 16:50:21.476870 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee5893d-3486-4ec4-9bab-76c57cd02c74" containerName="mariadb-account-create-update" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.476876 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee5893d-3486-4ec4-9bab-76c57cd02c74" containerName="mariadb-account-create-update" Feb 20 16:50:21 crc kubenswrapper[4697]: E0220 16:50:21.476893 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="init-config-reloader" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.476901 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="init-config-reloader" Feb 20 16:50:21 crc kubenswrapper[4697]: E0220 16:50:21.476920 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="thanos-sidecar" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.476926 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="thanos-sidecar" Feb 20 16:50:21 crc kubenswrapper[4697]: E0220 16:50:21.476935 4697 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="prometheus" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.476940 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="prometheus" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.477101 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="config-reloader" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.477117 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee5893d-3486-4ec4-9bab-76c57cd02c74" containerName="mariadb-account-create-update" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.477130 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="thanos-sidecar" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.477139 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" containerName="prometheus" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.478520 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.482731 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.482878 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.482904 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.482971 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.483059 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-tnj5w" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.483311 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.483447 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.487296 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.491205 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.514863 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.585586 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.585638 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.585662 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.585698 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.585784 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " 
pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.585826 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.585856 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9vv9\" (UniqueName: \"kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-kube-api-access-h9vv9\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.585885 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.585914 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.585948 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.585981 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.586050 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.586083 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687052 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687097 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687133 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687164 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687186 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687206 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9vv9\" (UniqueName: 
\"kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-kube-api-access-h9vv9\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687226 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687252 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687273 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687300 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687334 
4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687374 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.687405 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.689596 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.690198 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc 
kubenswrapper[4697]: I0220 16:50:21.690381 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.691108 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.693285 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.693599 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.693844 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " 
pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.693974 4697 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.693999 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ab999197bb553d62566307ca20e48f871152403b8e1c643d4e6f778eae279956/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.697242 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.697842 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.698023 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.710719 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.711287 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9vv9\" (UniqueName: \"kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-kube-api-access-h9vv9\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.744301 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:21 crc kubenswrapper[4697]: I0220 16:50:21.801025 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.188738 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5224cc9f-d610-4ea0-94da-11cdb019dcce","Type":"ContainerStarted","Data":"b97e258cdd1189fb417c1ab6d7c7def29ec543522f72b6a334728b99dc9911f0"} Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.190122 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.212637 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"591f7e7d-78bf-43a5-afe2-119f93765311","Type":"ContainerStarted","Data":"ea1d392cd1fa9d376f73631a5fe65996e572651204044c665beccb42ef4bda4d"} Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.213430 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.237722 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.830429334 podStartE2EDuration="1m0.237705338s" podCreationTimestamp="2026-02-20 16:49:22 +0000 UTC" firstStartedPulling="2026-02-20 16:49:35.402999303 +0000 UTC m=+1083.183044711" lastFinishedPulling="2026-02-20 16:49:42.810275307 +0000 UTC m=+1090.590320715" observedRunningTime="2026-02-20 16:50:22.234360865 +0000 UTC m=+1130.014406273" watchObservedRunningTime="2026-02-20 16:50:22.237705338 +0000 UTC m=+1130.017750746" Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.249867 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"40ca67b4-1eb6-40a6-ad33-1982ed83eb63","Type":"ContainerStarted","Data":"55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf"} Feb 20 16:50:22 crc 
kubenswrapper[4697]: I0220 16:50:22.250593 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.273863 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"e69ce4903cbc932c07a2e153c4970681cd2f42f3d2d3c2fc009340056896a4ad"} Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.273903 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"d5f2ccf15ab081f4e70712d3e1d692cc410fe63ca81e214e3570f999ce41ab06"} Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.273914 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"1083caeeb3ac49a4419cc7b7fc9507c42d9452ac84868e2dd19f1a63faa6a6fe"} Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.277564 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.299987 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/notifications-rabbitmq-server-0" podStartSLOduration=51.659443634 podStartE2EDuration="1m0.299966716s" podCreationTimestamp="2026-02-20 16:49:22 +0000 UTC" firstStartedPulling="2026-02-20 16:49:34.572528612 +0000 UTC m=+1082.352574020" lastFinishedPulling="2026-02-20 16:49:43.213051694 +0000 UTC m=+1090.993097102" observedRunningTime="2026-02-20 16:50:22.291869798 +0000 UTC m=+1130.071915206" watchObservedRunningTime="2026-02-20 16:50:22.299966716 +0000 UTC m=+1130.080012124" Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.333200 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.006726467 podStartE2EDuration="59.333178302s" podCreationTimestamp="2026-02-20 16:49:23 +0000 UTC" firstStartedPulling="2026-02-20 16:49:35.483483983 +0000 UTC m=+1083.263529391" lastFinishedPulling="2026-02-20 16:49:42.809935818 +0000 UTC m=+1090.589981226" observedRunningTime="2026-02-20 16:50:22.323447073 +0000 UTC m=+1130.103492481" watchObservedRunningTime="2026-02-20 16:50:22.333178302 +0000 UTC m=+1130.113223700" Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.893337 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee5893d-3486-4ec4-9bab-76c57cd02c74" path="/var/lib/kubelet/pods/3ee5893d-3486-4ec4-9bab-76c57cd02c74/volumes" Feb 20 16:50:22 crc kubenswrapper[4697]: I0220 16:50:22.893966 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c08b5df3-217d-41d0-b021-d29a0b7e7dd2" path="/var/lib/kubelet/pods/c08b5df3-217d-41d0-b021-d29a0b7e7dd2/volumes" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.206497 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-62kkp" podUID="aa3e6a99-a9a4-4578-94c6-8a4b641405ec" containerName="ovn-controller" probeResult="failure" output=< Feb 20 16:50:23 crc kubenswrapper[4697]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 20 16:50:23 crc kubenswrapper[4697]: > Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.303040 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.324230 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7wh5h" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.337415 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"3d52c48d52318cf633913ed8b77c8265eb2a50069be3678f50db7c5e87f7e1f2"} Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.342566 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23464f44-ddc9-4b6e-8e53-6196d0136cc0","Type":"ContainerStarted","Data":"6c9e20c7fcb3e218e91783aeb1fc75df8aeab9083f9bae04ed9074507666312d"} Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.397935 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.398918 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.467220 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.580183 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-62kkp-config-ndrvk"] Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.581513 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.583790 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.598148 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-62kkp-config-ndrvk"] Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.728639 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbw98\" (UniqueName: \"kubernetes.io/projected/fc77ee14-633c-435e-8169-9ac861f0b228-kube-api-access-jbw98\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.728984 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.729019 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-log-ovn\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.729045 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-scripts\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: 
\"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.729205 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run-ovn\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.729434 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-additional-scripts\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.831106 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run-ovn\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.831161 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-additional-scripts\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.831225 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbw98\" (UniqueName: 
\"kubernetes.io/projected/fc77ee14-633c-435e-8169-9ac861f0b228-kube-api-access-jbw98\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.831259 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.831291 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-log-ovn\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.831315 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-scripts\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.831495 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run-ovn\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.831565 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.831691 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-log-ovn\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.832100 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-additional-scripts\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.833862 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-scripts\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.847152 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbw98\" (UniqueName: \"kubernetes.io/projected/fc77ee14-633c-435e-8169-9ac861f0b228-kube-api-access-jbw98\") pod \"ovn-controller-62kkp-config-ndrvk\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:23 crc kubenswrapper[4697]: I0220 16:50:23.895340 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:24 crc kubenswrapper[4697]: I0220 16:50:24.452708 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-62kkp-config-ndrvk"] Feb 20 16:50:24 crc kubenswrapper[4697]: I0220 16:50:24.457537 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"366ae25108c57fd77b7ed0611ebc8c1f979af0b0e1d6733acc2564b00a4db6c7"} Feb 20 16:50:24 crc kubenswrapper[4697]: I0220 16:50:24.457578 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"a4c876824b459859a50149e4fb62921ed1b4377a38e63e2f6b41d9870d823d69"} Feb 20 16:50:24 crc kubenswrapper[4697]: I0220 16:50:24.457589 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"3158397e3418fc9d5231574c564f357d51e52140947d36fe9dff44fc96195f67"} Feb 20 16:50:24 crc kubenswrapper[4697]: I0220 16:50:24.457597 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"f114e34991614b0b108d858729447a1c2b5cd6bd53b7a28ff9aa3078b76e582a"} Feb 20 16:50:24 crc kubenswrapper[4697]: I0220 16:50:24.457606 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"6bf4cb203f202db8165d9f6a7cc0ac82bd2e86a7324358c5e7843fa08270ff4e"} Feb 20 16:50:24 crc kubenswrapper[4697]: W0220 16:50:24.466405 4697 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc77ee14_633c_435e_8169_9ac861f0b228.slice/crio-6519bf86a28165f47caa14f77bc25f9d46c7004bfc212aec3f3d621293304013 WatchSource:0}: Error finding container 6519bf86a28165f47caa14f77bc25f9d46c7004bfc212aec3f3d621293304013: Status 404 returned error can't find the container with id 6519bf86a28165f47caa14f77bc25f9d46c7004bfc212aec3f3d621293304013 Feb 20 16:50:24 crc kubenswrapper[4697]: I0220 16:50:24.527607 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:24 crc kubenswrapper[4697]: I0220 16:50:24.585779 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ljxq"] Feb 20 16:50:25 crc kubenswrapper[4697]: I0220 16:50:25.469488 4697 generic.go:334] "Generic (PLEG): container finished" podID="fc77ee14-633c-435e-8169-9ac861f0b228" containerID="6ddc72e4e05a1c71fab5df4cf5556bcf58783c78bdbb538457733b21d00a56f3" exitCode=0 Feb 20 16:50:25 crc kubenswrapper[4697]: I0220 16:50:25.469667 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-62kkp-config-ndrvk" event={"ID":"fc77ee14-633c-435e-8169-9ac861f0b228","Type":"ContainerDied","Data":"6ddc72e4e05a1c71fab5df4cf5556bcf58783c78bdbb538457733b21d00a56f3"} Feb 20 16:50:25 crc kubenswrapper[4697]: I0220 16:50:25.469871 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-62kkp-config-ndrvk" event={"ID":"fc77ee14-633c-435e-8169-9ac861f0b228","Type":"ContainerStarted","Data":"6519bf86a28165f47caa14f77bc25f9d46c7004bfc212aec3f3d621293304013"} Feb 20 16:50:25 crc kubenswrapper[4697]: I0220 16:50:25.477853 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"7cc99a836952bab3f6f21099b16472aa366578781c979aea9a6b384238c0732f"} Feb 20 16:50:25 crc 
kubenswrapper[4697]: I0220 16:50:25.478151 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8a8a227a-2c59-4ecd-a4c3-69c9018f1c13","Type":"ContainerStarted","Data":"ecdfa15c76e537af7458a877e420e03933112d05388f800179d2366e668316ea"} Feb 20 16:50:25 crc kubenswrapper[4697]: I0220 16:50:25.479774 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23464f44-ddc9-4b6e-8e53-6196d0136cc0","Type":"ContainerStarted","Data":"17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d"} Feb 20 16:50:25 crc kubenswrapper[4697]: I0220 16:50:25.555129 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.436402284 podStartE2EDuration="25.555112381s" podCreationTimestamp="2026-02-20 16:50:00 +0000 UTC" firstStartedPulling="2026-02-20 16:50:18.046775323 +0000 UTC m=+1125.826820731" lastFinishedPulling="2026-02-20 16:50:23.16548541 +0000 UTC m=+1130.945530828" observedRunningTime="2026-02-20 16:50:25.549246627 +0000 UTC m=+1133.329292055" watchObservedRunningTime="2026-02-20 16:50:25.555112381 +0000 UTC m=+1133.335157799" Feb 20 16:50:25 crc kubenswrapper[4697]: I0220 16:50:25.911585 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8f7465f5f-wx7p7"] Feb 20 16:50:25 crc kubenswrapper[4697]: I0220 16:50:25.915093 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:25 crc kubenswrapper[4697]: I0220 16:50:25.919182 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 20 16:50:25 crc kubenswrapper[4697]: I0220 16:50:25.921986 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8f7465f5f-wx7p7"] Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.040060 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mxxjc"] Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.041126 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mxxjc" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.043619 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.056113 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mxxjc"] Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.076339 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xtpb\" (UniqueName: \"kubernetes.io/projected/ed52949a-bfb6-4102-bd9f-9ea591aeb295-kube-api-access-9xtpb\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.076404 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-svc\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.076426 4697 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-sb\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.076473 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-swift-storage-0\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.076495 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-config\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.076538 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-nb\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.178505 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-svc\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.179290 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-svc\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.179351 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq5lm\" (UniqueName: \"kubernetes.io/projected/28e7c55b-4906-4b63-94ac-5503110f6c8f-kube-api-access-qq5lm\") pod \"root-account-create-update-mxxjc\" (UID: \"28e7c55b-4906-4b63-94ac-5503110f6c8f\") " pod="openstack/root-account-create-update-mxxjc" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.179374 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-sb\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.179411 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-swift-storage-0\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.179452 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-config\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.179507 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-nb\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.179532 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28e7c55b-4906-4b63-94ac-5503110f6c8f-operator-scripts\") pod \"root-account-create-update-mxxjc\" (UID: \"28e7c55b-4906-4b63-94ac-5503110f6c8f\") " pod="openstack/root-account-create-update-mxxjc" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.179594 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xtpb\" (UniqueName: \"kubernetes.io/projected/ed52949a-bfb6-4102-bd9f-9ea591aeb295-kube-api-access-9xtpb\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.180456 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-sb\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.180956 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-swift-storage-0\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.181581 4697 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-config\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.182152 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-nb\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.202595 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xtpb\" (UniqueName: \"kubernetes.io/projected/ed52949a-bfb6-4102-bd9f-9ea591aeb295-kube-api-access-9xtpb\") pod \"dnsmasq-dns-8f7465f5f-wx7p7\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.235652 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.280813 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28e7c55b-4906-4b63-94ac-5503110f6c8f-operator-scripts\") pod \"root-account-create-update-mxxjc\" (UID: \"28e7c55b-4906-4b63-94ac-5503110f6c8f\") " pod="openstack/root-account-create-update-mxxjc" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.281207 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq5lm\" (UniqueName: \"kubernetes.io/projected/28e7c55b-4906-4b63-94ac-5503110f6c8f-kube-api-access-qq5lm\") pod \"root-account-create-update-mxxjc\" (UID: \"28e7c55b-4906-4b63-94ac-5503110f6c8f\") " pod="openstack/root-account-create-update-mxxjc" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.282101 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28e7c55b-4906-4b63-94ac-5503110f6c8f-operator-scripts\") pod \"root-account-create-update-mxxjc\" (UID: \"28e7c55b-4906-4b63-94ac-5503110f6c8f\") " pod="openstack/root-account-create-update-mxxjc" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.300886 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq5lm\" (UniqueName: \"kubernetes.io/projected/28e7c55b-4906-4b63-94ac-5503110f6c8f-kube-api-access-qq5lm\") pod \"root-account-create-update-mxxjc\" (UID: \"28e7c55b-4906-4b63-94ac-5503110f6c8f\") " pod="openstack/root-account-create-update-mxxjc" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.357592 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mxxjc" Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.493678 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7ljxq" podUID="765973da-6ef4-44c6-823c-5670b03c5900" containerName="registry-server" containerID="cri-o://189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977" gracePeriod=2 Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.718042 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8f7465f5f-wx7p7"] Feb 20 16:50:26 crc kubenswrapper[4697]: W0220 16:50:26.723336 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded52949a_bfb6_4102_bd9f_9ea591aeb295.slice/crio-7c17e45fb4317b3a9f37e3c07b68782043312bc2ccf550ffc9c9b2316c3bfa84 WatchSource:0}: Error finding container 7c17e45fb4317b3a9f37e3c07b68782043312bc2ccf550ffc9c9b2316c3bfa84: Status 404 returned error can't find the container with id 7c17e45fb4317b3a9f37e3c07b68782043312bc2ccf550ffc9c9b2316c3bfa84 Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.876654 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mxxjc"] Feb 20 16:50:26 crc kubenswrapper[4697]: W0220 16:50:26.887383 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28e7c55b_4906_4b63_94ac_5503110f6c8f.slice/crio-5b4f1c2b906a060ed60aa5ba49ab03789ba2ad8112e8885106139a1ec666ebb1 WatchSource:0}: Error finding container 5b4f1c2b906a060ed60aa5ba49ab03789ba2ad8112e8885106139a1ec666ebb1: Status 404 returned error can't find the container with id 5b4f1c2b906a060ed60aa5ba49ab03789ba2ad8112e8885106139a1ec666ebb1 Feb 20 16:50:26 crc kubenswrapper[4697]: I0220 16:50:26.954917 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.015579 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.101263 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-additional-scripts\") pod \"fc77ee14-633c-435e-8169-9ac861f0b228\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.101385 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run-ovn\") pod \"fc77ee14-633c-435e-8169-9ac861f0b228\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.101455 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-scripts\") pod \"fc77ee14-633c-435e-8169-9ac861f0b228\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.101495 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run\") pod \"fc77ee14-633c-435e-8169-9ac861f0b228\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.101560 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2c8j\" (UniqueName: \"kubernetes.io/projected/765973da-6ef4-44c6-823c-5670b03c5900-kube-api-access-n2c8j\") pod \"765973da-6ef4-44c6-823c-5670b03c5900\" (UID: 
\"765973da-6ef4-44c6-823c-5670b03c5900\") " Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.101610 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-log-ovn\") pod \"fc77ee14-633c-435e-8169-9ac861f0b228\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.101641 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbw98\" (UniqueName: \"kubernetes.io/projected/fc77ee14-633c-435e-8169-9ac861f0b228-kube-api-access-jbw98\") pod \"fc77ee14-633c-435e-8169-9ac861f0b228\" (UID: \"fc77ee14-633c-435e-8169-9ac861f0b228\") " Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.101681 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-catalog-content\") pod \"765973da-6ef4-44c6-823c-5670b03c5900\" (UID: \"765973da-6ef4-44c6-823c-5670b03c5900\") " Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.101702 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-utilities\") pod \"765973da-6ef4-44c6-823c-5670b03c5900\" (UID: \"765973da-6ef4-44c6-823c-5670b03c5900\") " Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.101489 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fc77ee14-633c-435e-8169-9ac861f0b228" (UID: "fc77ee14-633c-435e-8169-9ac861f0b228"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.102082 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run" (OuterVolumeSpecName: "var-run") pod "fc77ee14-633c-435e-8169-9ac861f0b228" (UID: "fc77ee14-633c-435e-8169-9ac861f0b228"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.102310 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fc77ee14-633c-435e-8169-9ac861f0b228" (UID: "fc77ee14-633c-435e-8169-9ac861f0b228"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.102850 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-utilities" (OuterVolumeSpecName: "utilities") pod "765973da-6ef4-44c6-823c-5670b03c5900" (UID: "765973da-6ef4-44c6-823c-5670b03c5900"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.102958 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "fc77ee14-633c-435e-8169-9ac861f0b228" (UID: "fc77ee14-633c-435e-8169-9ac861f0b228"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.103123 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-scripts" (OuterVolumeSpecName: "scripts") pod "fc77ee14-633c-435e-8169-9ac861f0b228" (UID: "fc77ee14-633c-435e-8169-9ac861f0b228"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.107258 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/765973da-6ef4-44c6-823c-5670b03c5900-kube-api-access-n2c8j" (OuterVolumeSpecName: "kube-api-access-n2c8j") pod "765973da-6ef4-44c6-823c-5670b03c5900" (UID: "765973da-6ef4-44c6-823c-5670b03c5900"). InnerVolumeSpecName "kube-api-access-n2c8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.107369 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc77ee14-633c-435e-8169-9ac861f0b228-kube-api-access-jbw98" (OuterVolumeSpecName: "kube-api-access-jbw98") pod "fc77ee14-633c-435e-8169-9ac861f0b228" (UID: "fc77ee14-633c-435e-8169-9ac861f0b228"). InnerVolumeSpecName "kube-api-access-jbw98". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.124516 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "765973da-6ef4-44c6-823c-5670b03c5900" (UID: "765973da-6ef4-44c6-823c-5670b03c5900"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.204373 4697 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.204411 4697 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.204420 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc77ee14-633c-435e-8169-9ac861f0b228-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.204546 4697 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.204557 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2c8j\" (UniqueName: \"kubernetes.io/projected/765973da-6ef4-44c6-823c-5670b03c5900-kube-api-access-n2c8j\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.204567 4697 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc77ee14-633c-435e-8169-9ac861f0b228-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.204576 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbw98\" (UniqueName: \"kubernetes.io/projected/fc77ee14-633c-435e-8169-9ac861f0b228-kube-api-access-jbw98\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.204585 4697 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.204593 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765973da-6ef4-44c6-823c-5670b03c5900-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.520721 4697 generic.go:334] "Generic (PLEG): container finished" podID="28e7c55b-4906-4b63-94ac-5503110f6c8f" containerID="769a09d472a35b799718765a4a9a09b190b50e8a33ffc695bb527e5467b54067" exitCode=0 Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.520808 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mxxjc" event={"ID":"28e7c55b-4906-4b63-94ac-5503110f6c8f","Type":"ContainerDied","Data":"769a09d472a35b799718765a4a9a09b190b50e8a33ffc695bb527e5467b54067"} Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.520910 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mxxjc" event={"ID":"28e7c55b-4906-4b63-94ac-5503110f6c8f","Type":"ContainerStarted","Data":"5b4f1c2b906a060ed60aa5ba49ab03789ba2ad8112e8885106139a1ec666ebb1"} Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.531843 4697 generic.go:334] "Generic (PLEG): container finished" podID="765973da-6ef4-44c6-823c-5670b03c5900" containerID="189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977" exitCode=0 Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.531975 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ljxq" event={"ID":"765973da-6ef4-44c6-823c-5670b03c5900","Type":"ContainerDied","Data":"189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977"} Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.532010 4697 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ljxq" event={"ID":"765973da-6ef4-44c6-823c-5670b03c5900","Type":"ContainerDied","Data":"534d03c3ada0b52b989e9f63079bba70eb48846c3ac3385b668cd0baeef99f27"} Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.532029 4697 scope.go:117] "RemoveContainer" containerID="189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.532248 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ljxq" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.538950 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-62kkp-config-ndrvk" event={"ID":"fc77ee14-633c-435e-8169-9ac861f0b228","Type":"ContainerDied","Data":"6519bf86a28165f47caa14f77bc25f9d46c7004bfc212aec3f3d621293304013"} Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.538990 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6519bf86a28165f47caa14f77bc25f9d46c7004bfc212aec3f3d621293304013" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.539046 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-62kkp-config-ndrvk" Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.540503 4697 generic.go:334] "Generic (PLEG): container finished" podID="ed52949a-bfb6-4102-bd9f-9ea591aeb295" containerID="5d59703e58aed3ea649faa9d298e8b65da7de8f6f3df345447fa59fa396a07df" exitCode=0 Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.540555 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" event={"ID":"ed52949a-bfb6-4102-bd9f-9ea591aeb295","Type":"ContainerDied","Data":"5d59703e58aed3ea649faa9d298e8b65da7de8f6f3df345447fa59fa396a07df"} Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.540569 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" event={"ID":"ed52949a-bfb6-4102-bd9f-9ea591aeb295","Type":"ContainerStarted","Data":"7c17e45fb4317b3a9f37e3c07b68782043312bc2ccf550ffc9c9b2316c3bfa84"} Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.601100 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ljxq"] Feb 20 16:50:27 crc kubenswrapper[4697]: I0220 16:50:27.610246 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ljxq"] Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.047084 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-62kkp-config-ndrvk"] Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.055448 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-62kkp-config-ndrvk"] Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.086326 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-62kkp-config-qgcdl"] Feb 20 16:50:28 crc kubenswrapper[4697]: E0220 16:50:28.087229 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765973da-6ef4-44c6-823c-5670b03c5900" 
containerName="extract-content" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.087246 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="765973da-6ef4-44c6-823c-5670b03c5900" containerName="extract-content" Feb 20 16:50:28 crc kubenswrapper[4697]: E0220 16:50:28.087260 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc77ee14-633c-435e-8169-9ac861f0b228" containerName="ovn-config" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.087266 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc77ee14-633c-435e-8169-9ac861f0b228" containerName="ovn-config" Feb 20 16:50:28 crc kubenswrapper[4697]: E0220 16:50:28.087296 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765973da-6ef4-44c6-823c-5670b03c5900" containerName="registry-server" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.087304 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="765973da-6ef4-44c6-823c-5670b03c5900" containerName="registry-server" Feb 20 16:50:28 crc kubenswrapper[4697]: E0220 16:50:28.087311 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765973da-6ef4-44c6-823c-5670b03c5900" containerName="extract-utilities" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.087317 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="765973da-6ef4-44c6-823c-5670b03c5900" containerName="extract-utilities" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.087502 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="765973da-6ef4-44c6-823c-5670b03c5900" containerName="registry-server" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.087525 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc77ee14-633c-435e-8169-9ac861f0b228" containerName="ovn-config" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.088131 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.099158 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.100777 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-62kkp-config-qgcdl"] Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.181928 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-62kkp" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.234268 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-additional-scripts\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.234361 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.234454 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run-ovn\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.234516 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-scripts\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.234621 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-log-ovn\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.234671 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8ftj\" (UniqueName: \"kubernetes.io/projected/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-kube-api-access-q8ftj\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.336234 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-additional-scripts\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.336313 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.336383 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run-ovn\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.336405 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-scripts\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.336457 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-log-ovn\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.336533 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8ftj\" (UniqueName: \"kubernetes.io/projected/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-kube-api-access-q8ftj\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.337261 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.337264 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-log-ovn\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.337278 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run-ovn\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.338479 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-additional-scripts\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.340906 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-scripts\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.359152 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8ftj\" (UniqueName: \"kubernetes.io/projected/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-kube-api-access-q8ftj\") pod \"ovn-controller-62kkp-config-qgcdl\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.410657 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.887640 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="765973da-6ef4-44c6-823c-5670b03c5900" path="/var/lib/kubelet/pods/765973da-6ef4-44c6-823c-5670b03c5900/volumes" Feb 20 16:50:28 crc kubenswrapper[4697]: I0220 16:50:28.888405 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc77ee14-633c-435e-8169-9ac861f0b228" path="/var/lib/kubelet/pods/fc77ee14-633c-435e-8169-9ac861f0b228/volumes" Feb 20 16:50:31 crc kubenswrapper[4697]: I0220 16:50:31.578027 4697 generic.go:334] "Generic (PLEG): container finished" podID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerID="17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d" exitCode=0 Feb 20 16:50:31 crc kubenswrapper[4697]: I0220 16:50:31.578115 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23464f44-ddc9-4b6e-8e53-6196d0136cc0","Type":"ContainerDied","Data":"17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d"} Feb 20 16:50:34 crc kubenswrapper[4697]: I0220 16:50:34.034178 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/notifications-rabbitmq-server-0" podUID="591f7e7d-78bf-43a5-afe2-119f93765311" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Feb 20 16:50:34 crc kubenswrapper[4697]: I0220 16:50:34.278784 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5224cc9f-d610-4ea0-94da-11cdb019dcce" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Feb 20 16:50:34 crc kubenswrapper[4697]: I0220 16:50:34.579252 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="40ca67b4-1eb6-40a6-ad33-1982ed83eb63" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Feb 20 16:50:34 crc kubenswrapper[4697]: I0220 16:50:34.977643 4697 scope.go:117] "RemoveContainer" containerID="aecf1c037a93e5924982cc68fbff4549bccb459c3817afef621cb031da7578e7" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.029901 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mxxjc" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.105399 4697 scope.go:117] "RemoveContainer" containerID="d8f1c25e72ddbd0f90826a02ecde90644bf87732184fe1cab2d7fa2f912ec797" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.144220 4697 scope.go:117] "RemoveContainer" containerID="189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977" Feb 20 16:50:35 crc kubenswrapper[4697]: E0220 16:50:35.144803 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977\": container with ID starting with 189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977 not found: ID does not exist" containerID="189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.144869 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977"} err="failed to get container status \"189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977\": rpc error: code = NotFound desc = could not find container \"189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977\": container with ID starting with 189ec1263e8b7aab2f485788f7412228f26da0d7a11da901c55ca0ad6ff79977 not found: ID does not exist" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.144921 4697 scope.go:117] 
"RemoveContainer" containerID="aecf1c037a93e5924982cc68fbff4549bccb459c3817afef621cb031da7578e7" Feb 20 16:50:35 crc kubenswrapper[4697]: E0220 16:50:35.146739 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aecf1c037a93e5924982cc68fbff4549bccb459c3817afef621cb031da7578e7\": container with ID starting with aecf1c037a93e5924982cc68fbff4549bccb459c3817afef621cb031da7578e7 not found: ID does not exist" containerID="aecf1c037a93e5924982cc68fbff4549bccb459c3817afef621cb031da7578e7" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.146778 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aecf1c037a93e5924982cc68fbff4549bccb459c3817afef621cb031da7578e7"} err="failed to get container status \"aecf1c037a93e5924982cc68fbff4549bccb459c3817afef621cb031da7578e7\": rpc error: code = NotFound desc = could not find container \"aecf1c037a93e5924982cc68fbff4549bccb459c3817afef621cb031da7578e7\": container with ID starting with aecf1c037a93e5924982cc68fbff4549bccb459c3817afef621cb031da7578e7 not found: ID does not exist" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.146802 4697 scope.go:117] "RemoveContainer" containerID="d8f1c25e72ddbd0f90826a02ecde90644bf87732184fe1cab2d7fa2f912ec797" Feb 20 16:50:35 crc kubenswrapper[4697]: E0220 16:50:35.148337 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8f1c25e72ddbd0f90826a02ecde90644bf87732184fe1cab2d7fa2f912ec797\": container with ID starting with d8f1c25e72ddbd0f90826a02ecde90644bf87732184fe1cab2d7fa2f912ec797 not found: ID does not exist" containerID="d8f1c25e72ddbd0f90826a02ecde90644bf87732184fe1cab2d7fa2f912ec797" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.148369 4697 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d8f1c25e72ddbd0f90826a02ecde90644bf87732184fe1cab2d7fa2f912ec797"} err="failed to get container status \"d8f1c25e72ddbd0f90826a02ecde90644bf87732184fe1cab2d7fa2f912ec797\": rpc error: code = NotFound desc = could not find container \"d8f1c25e72ddbd0f90826a02ecde90644bf87732184fe1cab2d7fa2f912ec797\": container with ID starting with d8f1c25e72ddbd0f90826a02ecde90644bf87732184fe1cab2d7fa2f912ec797 not found: ID does not exist" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.155348 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq5lm\" (UniqueName: \"kubernetes.io/projected/28e7c55b-4906-4b63-94ac-5503110f6c8f-kube-api-access-qq5lm\") pod \"28e7c55b-4906-4b63-94ac-5503110f6c8f\" (UID: \"28e7c55b-4906-4b63-94ac-5503110f6c8f\") " Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.155533 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28e7c55b-4906-4b63-94ac-5503110f6c8f-operator-scripts\") pod \"28e7c55b-4906-4b63-94ac-5503110f6c8f\" (UID: \"28e7c55b-4906-4b63-94ac-5503110f6c8f\") " Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.156076 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e7c55b-4906-4b63-94ac-5503110f6c8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28e7c55b-4906-4b63-94ac-5503110f6c8f" (UID: "28e7c55b-4906-4b63-94ac-5503110f6c8f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.162229 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e7c55b-4906-4b63-94ac-5503110f6c8f-kube-api-access-qq5lm" (OuterVolumeSpecName: "kube-api-access-qq5lm") pod "28e7c55b-4906-4b63-94ac-5503110f6c8f" (UID: "28e7c55b-4906-4b63-94ac-5503110f6c8f"). 
InnerVolumeSpecName "kube-api-access-qq5lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.257589 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq5lm\" (UniqueName: \"kubernetes.io/projected/28e7c55b-4906-4b63-94ac-5503110f6c8f-kube-api-access-qq5lm\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.258269 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28e7c55b-4906-4b63-94ac-5503110f6c8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.534307 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-62kkp-config-qgcdl"] Feb 20 16:50:35 crc kubenswrapper[4697]: W0220 16:50:35.552861 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c0d4af5_9fe3_4f84_b98e_58e6197232e3.slice/crio-cd3abe4212273d70232df720210323855e13a44eb0e8526a7b83991abbf817b5 WatchSource:0}: Error finding container cd3abe4212273d70232df720210323855e13a44eb0e8526a7b83991abbf817b5: Status 404 returned error can't find the container with id cd3abe4212273d70232df720210323855e13a44eb0e8526a7b83991abbf817b5 Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.629377 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23464f44-ddc9-4b6e-8e53-6196d0136cc0","Type":"ContainerStarted","Data":"6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9"} Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.631610 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mxxjc" event={"ID":"28e7c55b-4906-4b63-94ac-5503110f6c8f","Type":"ContainerDied","Data":"5b4f1c2b906a060ed60aa5ba49ab03789ba2ad8112e8885106139a1ec666ebb1"} Feb 
20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.631665 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b4f1c2b906a060ed60aa5ba49ab03789ba2ad8112e8885106139a1ec666ebb1" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.631633 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mxxjc" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.634037 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" event={"ID":"ed52949a-bfb6-4102-bd9f-9ea591aeb295","Type":"ContainerStarted","Data":"43bb90b7cde45f9dd32ed8f6eeba7d2e8821c7c7244d353a87bf75f0802a6a2c"} Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.635627 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:35 crc kubenswrapper[4697]: I0220 16:50:35.655201 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-62kkp-config-qgcdl" event={"ID":"8c0d4af5-9fe3-4f84-b98e-58e6197232e3","Type":"ContainerStarted","Data":"cd3abe4212273d70232df720210323855e13a44eb0e8526a7b83991abbf817b5"} Feb 20 16:50:36 crc kubenswrapper[4697]: I0220 16:50:36.073402 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" podStartSLOduration=11.073379872 podStartE2EDuration="11.073379872s" podCreationTimestamp="2026-02-20 16:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:35.666775567 +0000 UTC m=+1143.446820975" watchObservedRunningTime="2026-02-20 16:50:36.073379872 +0000 UTC m=+1143.853425280" Feb 20 16:50:36 crc kubenswrapper[4697]: I0220 16:50:36.666721 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wtqdz" 
event={"ID":"1c949e32-f57d-4f71-aaae-192d3ceea6de","Type":"ContainerStarted","Data":"a4b4e357940009fb44409ab931c02605e36f2e3f92444a7c26bd5340d07bd903"} Feb 20 16:50:36 crc kubenswrapper[4697]: I0220 16:50:36.668992 4697 generic.go:334] "Generic (PLEG): container finished" podID="8c0d4af5-9fe3-4f84-b98e-58e6197232e3" containerID="025f2c983bd43551b0e9c62d8fa67bc40b843ab775b826c03b5058a2e419c58c" exitCode=0 Feb 20 16:50:36 crc kubenswrapper[4697]: I0220 16:50:36.669039 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-62kkp-config-qgcdl" event={"ID":"8c0d4af5-9fe3-4f84-b98e-58e6197232e3","Type":"ContainerDied","Data":"025f2c983bd43551b0e9c62d8fa67bc40b843ab775b826c03b5058a2e419c58c"} Feb 20 16:50:36 crc kubenswrapper[4697]: I0220 16:50:36.697906 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wtqdz" podStartSLOduration=3.5843481280000002 podStartE2EDuration="19.697878788s" podCreationTimestamp="2026-02-20 16:50:17 +0000 UTC" firstStartedPulling="2026-02-20 16:50:18.991840651 +0000 UTC m=+1126.771886059" lastFinishedPulling="2026-02-20 16:50:35.105371311 +0000 UTC m=+1142.885416719" observedRunningTime="2026-02-20 16:50:36.683643458 +0000 UTC m=+1144.463688876" watchObservedRunningTime="2026-02-20 16:50:36.697878788 +0000 UTC m=+1144.477924246" Feb 20 16:50:37 crc kubenswrapper[4697]: I0220 16:50:37.683407 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23464f44-ddc9-4b6e-8e53-6196d0136cc0","Type":"ContainerStarted","Data":"7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8"} Feb 20 16:50:37 crc kubenswrapper[4697]: I0220 16:50:37.683794 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23464f44-ddc9-4b6e-8e53-6196d0136cc0","Type":"ContainerStarted","Data":"818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4"} Feb 20 16:50:37 crc 
kubenswrapper[4697]: I0220 16:50:37.756629 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.756610656 podStartE2EDuration="16.756610656s" podCreationTimestamp="2026-02-20 16:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:37.744620041 +0000 UTC m=+1145.524665459" watchObservedRunningTime="2026-02-20 16:50:37.756610656 +0000 UTC m=+1145.536656084" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.143396 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.214171 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run-ovn\") pod \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.214575 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-scripts\") pod \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.214620 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8ftj\" (UniqueName: \"kubernetes.io/projected/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-kube-api-access-q8ftj\") pod \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.214677 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-log-ovn\") pod \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.214714 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-additional-scripts\") pod \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.214737 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run\") pod \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\" (UID: \"8c0d4af5-9fe3-4f84-b98e-58e6197232e3\") " Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.214264 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8c0d4af5-9fe3-4f84-b98e-58e6197232e3" (UID: "8c0d4af5-9fe3-4f84-b98e-58e6197232e3"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.215188 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run" (OuterVolumeSpecName: "var-run") pod "8c0d4af5-9fe3-4f84-b98e-58e6197232e3" (UID: "8c0d4af5-9fe3-4f84-b98e-58e6197232e3"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.215735 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-scripts" (OuterVolumeSpecName: "scripts") pod "8c0d4af5-9fe3-4f84-b98e-58e6197232e3" (UID: "8c0d4af5-9fe3-4f84-b98e-58e6197232e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.215811 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8c0d4af5-9fe3-4f84-b98e-58e6197232e3" (UID: "8c0d4af5-9fe3-4f84-b98e-58e6197232e3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.216040 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8c0d4af5-9fe3-4f84-b98e-58e6197232e3" (UID: "8c0d4af5-9fe3-4f84-b98e-58e6197232e3"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.223127 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-kube-api-access-q8ftj" (OuterVolumeSpecName: "kube-api-access-q8ftj") pod "8c0d4af5-9fe3-4f84-b98e-58e6197232e3" (UID: "8c0d4af5-9fe3-4f84-b98e-58e6197232e3"). InnerVolumeSpecName "kube-api-access-q8ftj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.316132 4697 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.316164 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.316174 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8ftj\" (UniqueName: \"kubernetes.io/projected/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-kube-api-access-q8ftj\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.316186 4697 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.316195 4697 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.316205 4697 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c0d4af5-9fe3-4f84-b98e-58e6197232e3-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.692058 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-62kkp-config-qgcdl" Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.692076 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-62kkp-config-qgcdl" event={"ID":"8c0d4af5-9fe3-4f84-b98e-58e6197232e3","Type":"ContainerDied","Data":"cd3abe4212273d70232df720210323855e13a44eb0e8526a7b83991abbf817b5"} Feb 20 16:50:38 crc kubenswrapper[4697]: I0220 16:50:38.692153 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd3abe4212273d70232df720210323855e13a44eb0e8526a7b83991abbf817b5" Feb 20 16:50:39 crc kubenswrapper[4697]: I0220 16:50:39.229871 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-62kkp-config-qgcdl"] Feb 20 16:50:39 crc kubenswrapper[4697]: I0220 16:50:39.238684 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-62kkp-config-qgcdl"] Feb 20 16:50:40 crc kubenswrapper[4697]: I0220 16:50:40.890314 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c0d4af5-9fe3-4f84-b98e-58e6197232e3" path="/var/lib/kubelet/pods/8c0d4af5-9fe3-4f84-b98e-58e6197232e3/volumes" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.237631 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.306460 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8f6ffc97-zdlsd"] Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.306796 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" podUID="51280d75-c146-43f2-967b-bb34e273f534" containerName="dnsmasq-dns" containerID="cri-o://70a64ecf59ec1712cb0795c6b3a95abdb5efff8f175d2f6d4f12ce800db5be9c" gracePeriod=10 Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.715573 4697 generic.go:334] "Generic 
(PLEG): container finished" podID="51280d75-c146-43f2-967b-bb34e273f534" containerID="70a64ecf59ec1712cb0795c6b3a95abdb5efff8f175d2f6d4f12ce800db5be9c" exitCode=0 Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.715935 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" event={"ID":"51280d75-c146-43f2-967b-bb34e273f534","Type":"ContainerDied","Data":"70a64ecf59ec1712cb0795c6b3a95abdb5efff8f175d2f6d4f12ce800db5be9c"} Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.801472 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.808073 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.873809 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-dns-svc\") pod \"51280d75-c146-43f2-967b-bb34e273f534\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.873986 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5skr8\" (UniqueName: \"kubernetes.io/projected/51280d75-c146-43f2-967b-bb34e273f534-kube-api-access-5skr8\") pod \"51280d75-c146-43f2-967b-bb34e273f534\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.874028 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-config\") pod \"51280d75-c146-43f2-967b-bb34e273f534\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.874091 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-sb\") pod \"51280d75-c146-43f2-967b-bb34e273f534\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.874118 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-nb\") pod \"51280d75-c146-43f2-967b-bb34e273f534\" (UID: \"51280d75-c146-43f2-967b-bb34e273f534\") " Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.880920 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51280d75-c146-43f2-967b-bb34e273f534-kube-api-access-5skr8" (OuterVolumeSpecName: "kube-api-access-5skr8") pod "51280d75-c146-43f2-967b-bb34e273f534" (UID: "51280d75-c146-43f2-967b-bb34e273f534"). InnerVolumeSpecName "kube-api-access-5skr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.914367 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-config" (OuterVolumeSpecName: "config") pod "51280d75-c146-43f2-967b-bb34e273f534" (UID: "51280d75-c146-43f2-967b-bb34e273f534"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.918595 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "51280d75-c146-43f2-967b-bb34e273f534" (UID: "51280d75-c146-43f2-967b-bb34e273f534"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.920611 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "51280d75-c146-43f2-967b-bb34e273f534" (UID: "51280d75-c146-43f2-967b-bb34e273f534"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.922808 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51280d75-c146-43f2-967b-bb34e273f534" (UID: "51280d75-c146-43f2-967b-bb34e273f534"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.976504 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5skr8\" (UniqueName: \"kubernetes.io/projected/51280d75-c146-43f2-967b-bb34e273f534-kube-api-access-5skr8\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.976542 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.976556 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.976569 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 
16:50:41 crc kubenswrapper[4697]: I0220 16:50:41.976581 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51280d75-c146-43f2-967b-bb34e273f534-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:42 crc kubenswrapper[4697]: I0220 16:50:42.724786 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" event={"ID":"51280d75-c146-43f2-967b-bb34e273f534","Type":"ContainerDied","Data":"c55487e9d38def07e799de5bc7dec78d34fbf277348a5f958d0e89665db90f26"} Feb 20 16:50:42 crc kubenswrapper[4697]: I0220 16:50:42.724847 4697 scope.go:117] "RemoveContainer" containerID="70a64ecf59ec1712cb0795c6b3a95abdb5efff8f175d2f6d4f12ce800db5be9c" Feb 20 16:50:42 crc kubenswrapper[4697]: I0220 16:50:42.724861 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8f6ffc97-zdlsd" Feb 20 16:50:42 crc kubenswrapper[4697]: I0220 16:50:42.762490 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8f6ffc97-zdlsd"] Feb 20 16:50:42 crc kubenswrapper[4697]: I0220 16:50:42.762588 4697 scope.go:117] "RemoveContainer" containerID="ce86da0a4e0ad24afde9d0476338cef5ba1dd97a2eec0575c25123847bc56810" Feb 20 16:50:42 crc kubenswrapper[4697]: I0220 16:50:42.771836 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8f6ffc97-zdlsd"] Feb 20 16:50:42 crc kubenswrapper[4697]: I0220 16:50:42.915029 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51280d75-c146-43f2-967b-bb34e273f534" path="/var/lib/kubelet/pods/51280d75-c146-43f2-967b-bb34e273f534/volumes" Feb 20 16:50:43 crc kubenswrapper[4697]: I0220 16:50:43.743981 4697 generic.go:334] "Generic (PLEG): container finished" podID="1c949e32-f57d-4f71-aaae-192d3ceea6de" containerID="a4b4e357940009fb44409ab931c02605e36f2e3f92444a7c26bd5340d07bd903" exitCode=0 Feb 20 16:50:43 crc kubenswrapper[4697]: I0220 16:50:43.744414 
4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wtqdz" event={"ID":"1c949e32-f57d-4f71-aaae-192d3ceea6de","Type":"ContainerDied","Data":"a4b4e357940009fb44409ab931c02605e36f2e3f92444a7c26bd5340d07bd903"} Feb 20 16:50:44 crc kubenswrapper[4697]: I0220 16:50:44.033663 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/notifications-rabbitmq-server-0" Feb 20 16:50:44 crc kubenswrapper[4697]: I0220 16:50:44.277227 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 20 16:50:44 crc kubenswrapper[4697]: I0220 16:50:44.579668 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.265743 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.435839 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-combined-ca-bundle\") pod \"1c949e32-f57d-4f71-aaae-192d3ceea6de\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.435925 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-config-data\") pod \"1c949e32-f57d-4f71-aaae-192d3ceea6de\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.436002 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zw6l\" (UniqueName: \"kubernetes.io/projected/1c949e32-f57d-4f71-aaae-192d3ceea6de-kube-api-access-4zw6l\") pod \"1c949e32-f57d-4f71-aaae-192d3ceea6de\" (UID: 
\"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.436077 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-db-sync-config-data\") pod \"1c949e32-f57d-4f71-aaae-192d3ceea6de\" (UID: \"1c949e32-f57d-4f71-aaae-192d3ceea6de\") " Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.442596 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1c949e32-f57d-4f71-aaae-192d3ceea6de" (UID: "1c949e32-f57d-4f71-aaae-192d3ceea6de"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.444514 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c949e32-f57d-4f71-aaae-192d3ceea6de-kube-api-access-4zw6l" (OuterVolumeSpecName: "kube-api-access-4zw6l") pod "1c949e32-f57d-4f71-aaae-192d3ceea6de" (UID: "1c949e32-f57d-4f71-aaae-192d3ceea6de"). InnerVolumeSpecName "kube-api-access-4zw6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.460743 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c949e32-f57d-4f71-aaae-192d3ceea6de" (UID: "1c949e32-f57d-4f71-aaae-192d3ceea6de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.486982 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-config-data" (OuterVolumeSpecName: "config-data") pod "1c949e32-f57d-4f71-aaae-192d3ceea6de" (UID: "1c949e32-f57d-4f71-aaae-192d3ceea6de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.537459 4697 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.537488 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.537504 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c949e32-f57d-4f71-aaae-192d3ceea6de-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.537513 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zw6l\" (UniqueName: \"kubernetes.io/projected/1c949e32-f57d-4f71-aaae-192d3ceea6de-kube-api-access-4zw6l\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.760068 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wtqdz" event={"ID":"1c949e32-f57d-4f71-aaae-192d3ceea6de","Type":"ContainerDied","Data":"00a89dee06dfdbb6a84832e045b4fe0eae87684e8b92c22dba4da70bb4e25efa"} Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.760103 4697 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="00a89dee06dfdbb6a84832e045b4fe0eae87684e8b92c22dba4da70bb4e25efa" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.760121 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wtqdz" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.828074 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hj5zq"] Feb 20 16:50:45 crc kubenswrapper[4697]: E0220 16:50:45.828396 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0d4af5-9fe3-4f84-b98e-58e6197232e3" containerName="ovn-config" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.828415 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0d4af5-9fe3-4f84-b98e-58e6197232e3" containerName="ovn-config" Feb 20 16:50:45 crc kubenswrapper[4697]: E0220 16:50:45.828445 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51280d75-c146-43f2-967b-bb34e273f534" containerName="init" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.828454 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="51280d75-c146-43f2-967b-bb34e273f534" containerName="init" Feb 20 16:50:45 crc kubenswrapper[4697]: E0220 16:50:45.828468 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e7c55b-4906-4b63-94ac-5503110f6c8f" containerName="mariadb-account-create-update" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.828476 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e7c55b-4906-4b63-94ac-5503110f6c8f" containerName="mariadb-account-create-update" Feb 20 16:50:45 crc kubenswrapper[4697]: E0220 16:50:45.828485 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51280d75-c146-43f2-967b-bb34e273f534" containerName="dnsmasq-dns" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.828491 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="51280d75-c146-43f2-967b-bb34e273f534" 
containerName="dnsmasq-dns" Feb 20 16:50:45 crc kubenswrapper[4697]: E0220 16:50:45.828501 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c949e32-f57d-4f71-aaae-192d3ceea6de" containerName="glance-db-sync" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.828506 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c949e32-f57d-4f71-aaae-192d3ceea6de" containerName="glance-db-sync" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.828686 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="51280d75-c146-43f2-967b-bb34e273f534" containerName="dnsmasq-dns" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.828700 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c949e32-f57d-4f71-aaae-192d3ceea6de" containerName="glance-db-sync" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.828711 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c0d4af5-9fe3-4f84-b98e-58e6197232e3" containerName="ovn-config" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.828721 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e7c55b-4906-4b63-94ac-5503110f6c8f" containerName="mariadb-account-create-update" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.829240 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hj5zq" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.852669 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hj5zq"] Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.942406 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-operator-scripts\") pod \"cinder-db-create-hj5zq\" (UID: \"c0ca09dc-ef65-4fec-8fea-eb06f96f3818\") " pod="openstack/cinder-db-create-hj5zq" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.942533 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68msz\" (UniqueName: \"kubernetes.io/projected/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-kube-api-access-68msz\") pod \"cinder-db-create-hj5zq\" (UID: \"c0ca09dc-ef65-4fec-8fea-eb06f96f3818\") " pod="openstack/cinder-db-create-hj5zq" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.945082 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-fd85-account-create-update-c4m4q"] Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.947304 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-fd85-account-create-update-c4m4q" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.949558 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 20 16:50:45 crc kubenswrapper[4697]: I0220 16:50:45.956421 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-fd85-account-create-update-c4m4q"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.045450 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5a353b-3fec-411e-8508-3b440af66824-operator-scripts\") pod \"barbican-fd85-account-create-update-c4m4q\" (UID: \"df5a353b-3fec-411e-8508-3b440af66824\") " pod="openstack/barbican-fd85-account-create-update-c4m4q" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.045536 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-operator-scripts\") pod \"cinder-db-create-hj5zq\" (UID: \"c0ca09dc-ef65-4fec-8fea-eb06f96f3818\") " pod="openstack/cinder-db-create-hj5zq" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.045610 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slkb9\" (UniqueName: \"kubernetes.io/projected/df5a353b-3fec-411e-8508-3b440af66824-kube-api-access-slkb9\") pod \"barbican-fd85-account-create-update-c4m4q\" (UID: \"df5a353b-3fec-411e-8508-3b440af66824\") " pod="openstack/barbican-fd85-account-create-update-c4m4q" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.045659 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68msz\" (UniqueName: \"kubernetes.io/projected/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-kube-api-access-68msz\") pod \"cinder-db-create-hj5zq\" (UID: 
\"c0ca09dc-ef65-4fec-8fea-eb06f96f3818\") " pod="openstack/cinder-db-create-hj5zq" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.046527 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-operator-scripts\") pod \"cinder-db-create-hj5zq\" (UID: \"c0ca09dc-ef65-4fec-8fea-eb06f96f3818\") " pod="openstack/cinder-db-create-hj5zq" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.048752 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-554gk"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.049861 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-554gk" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.066425 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-554gk"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.074454 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e438-account-create-update-pwcsf"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.075551 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e438-account-create-update-pwcsf" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.076195 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68msz\" (UniqueName: \"kubernetes.io/projected/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-kube-api-access-68msz\") pod \"cinder-db-create-hj5zq\" (UID: \"c0ca09dc-ef65-4fec-8fea-eb06f96f3818\") " pod="openstack/cinder-db-create-hj5zq" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.078614 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.105948 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e438-account-create-update-pwcsf"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.143272 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-v6dg9"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.144339 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.146766 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slkb9\" (UniqueName: \"kubernetes.io/projected/df5a353b-3fec-411e-8508-3b440af66824-kube-api-access-slkb9\") pod \"barbican-fd85-account-create-update-c4m4q\" (UID: \"df5a353b-3fec-411e-8508-3b440af66824\") " pod="openstack/barbican-fd85-account-create-update-c4m4q" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.146839 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5a353b-3fec-411e-8508-3b440af66824-operator-scripts\") pod \"barbican-fd85-account-create-update-c4m4q\" (UID: \"df5a353b-3fec-411e-8508-3b440af66824\") " pod="openstack/barbican-fd85-account-create-update-c4m4q" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.146885 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/785b68ff-7aef-47b3-85d8-587116e8ce3a-operator-scripts\") pod \"barbican-db-create-554gk\" (UID: \"785b68ff-7aef-47b3-85d8-587116e8ce3a\") " pod="openstack/barbican-db-create-554gk" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.146926 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5r49\" (UniqueName: \"kubernetes.io/projected/785b68ff-7aef-47b3-85d8-587116e8ce3a-kube-api-access-v5r49\") pod \"barbican-db-create-554gk\" (UID: \"785b68ff-7aef-47b3-85d8-587116e8ce3a\") " pod="openstack/barbican-db-create-554gk" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.147646 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5a353b-3fec-411e-8508-3b440af66824-operator-scripts\") pod 
\"barbican-fd85-account-create-update-c4m4q\" (UID: \"df5a353b-3fec-411e-8508-3b440af66824\") " pod="openstack/barbican-fd85-account-create-update-c4m4q" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.148790 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.148989 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.149655 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hj5zq" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.153643 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.153805 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nd79" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.174769 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v6dg9"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.185194 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slkb9\" (UniqueName: \"kubernetes.io/projected/df5a353b-3fec-411e-8508-3b440af66824-kube-api-access-slkb9\") pod \"barbican-fd85-account-create-update-c4m4q\" (UID: \"df5a353b-3fec-411e-8508-3b440af66824\") " pod="openstack/barbican-fd85-account-create-update-c4m4q" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.249041 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dxwm\" (UniqueName: \"kubernetes.io/projected/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-kube-api-access-5dxwm\") pod \"cinder-e438-account-create-update-pwcsf\" (UID: \"6d2e30bf-2042-4e8d-b465-f1cfe98b2678\") " 
pod="openstack/cinder-e438-account-create-update-pwcsf" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.249408 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/785b68ff-7aef-47b3-85d8-587116e8ce3a-operator-scripts\") pod \"barbican-db-create-554gk\" (UID: \"785b68ff-7aef-47b3-85d8-587116e8ce3a\") " pod="openstack/barbican-db-create-554gk" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.249473 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5r49\" (UniqueName: \"kubernetes.io/projected/785b68ff-7aef-47b3-85d8-587116e8ce3a-kube-api-access-v5r49\") pod \"barbican-db-create-554gk\" (UID: \"785b68ff-7aef-47b3-85d8-587116e8ce3a\") " pod="openstack/barbican-db-create-554gk" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.249525 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-config-data\") pod \"keystone-db-sync-v6dg9\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.249546 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk44b\" (UniqueName: \"kubernetes.io/projected/52cf1f6d-5da1-4b45-9dda-dbd21e972107-kube-api-access-dk44b\") pod \"keystone-db-sync-v6dg9\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.249587 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-operator-scripts\") pod \"cinder-e438-account-create-update-pwcsf\" (UID: 
\"6d2e30bf-2042-4e8d-b465-f1cfe98b2678\") " pod="openstack/cinder-e438-account-create-update-pwcsf" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.249612 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-combined-ca-bundle\") pod \"keystone-db-sync-v6dg9\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.250310 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/785b68ff-7aef-47b3-85d8-587116e8ce3a-operator-scripts\") pod \"barbican-db-create-554gk\" (UID: \"785b68ff-7aef-47b3-85d8-587116e8ce3a\") " pod="openstack/barbican-db-create-554gk" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.276581 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5r49\" (UniqueName: \"kubernetes.io/projected/785b68ff-7aef-47b3-85d8-587116e8ce3a-kube-api-access-v5r49\") pod \"barbican-db-create-554gk\" (UID: \"785b68ff-7aef-47b3-85d8-587116e8ce3a\") " pod="openstack/barbican-db-create-554gk" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.314227 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fd85-account-create-update-c4m4q" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.319228 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-lbhl4"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.320326 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-lbhl4" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.348894 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lbhl4"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.353985 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-operator-scripts\") pod \"cinder-e438-account-create-update-pwcsf\" (UID: \"6d2e30bf-2042-4e8d-b465-f1cfe98b2678\") " pod="openstack/cinder-e438-account-create-update-pwcsf" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.354042 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-combined-ca-bundle\") pod \"keystone-db-sync-v6dg9\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.354102 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dxwm\" (UniqueName: \"kubernetes.io/projected/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-kube-api-access-5dxwm\") pod \"cinder-e438-account-create-update-pwcsf\" (UID: \"6d2e30bf-2042-4e8d-b465-f1cfe98b2678\") " pod="openstack/cinder-e438-account-create-update-pwcsf" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.354194 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-config-data\") pod \"keystone-db-sync-v6dg9\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.354223 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk44b\" (UniqueName: 
\"kubernetes.io/projected/52cf1f6d-5da1-4b45-9dda-dbd21e972107-kube-api-access-dk44b\") pod \"keystone-db-sync-v6dg9\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.354930 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-operator-scripts\") pod \"cinder-e438-account-create-update-pwcsf\" (UID: \"6d2e30bf-2042-4e8d-b465-f1cfe98b2678\") " pod="openstack/cinder-e438-account-create-update-pwcsf" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.359731 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3eb9-account-create-update-ps2nd"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.359802 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-combined-ca-bundle\") pod \"keystone-db-sync-v6dg9\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.361085 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3eb9-account-create-update-ps2nd" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.365217 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-config-data\") pod \"keystone-db-sync-v6dg9\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.370843 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.378018 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-554gk" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.388517 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3eb9-account-create-update-ps2nd"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.389286 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dxwm\" (UniqueName: \"kubernetes.io/projected/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-kube-api-access-5dxwm\") pod \"cinder-e438-account-create-update-pwcsf\" (UID: \"6d2e30bf-2042-4e8d-b465-f1cfe98b2678\") " pod="openstack/cinder-e438-account-create-update-pwcsf" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.395067 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk44b\" (UniqueName: \"kubernetes.io/projected/52cf1f6d-5da1-4b45-9dda-dbd21e972107-kube-api-access-dk44b\") pod \"keystone-db-sync-v6dg9\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.395109 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e438-account-create-update-pwcsf" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.455277 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcq6p\" (UniqueName: \"kubernetes.io/projected/87689cb4-4d4f-431a-958e-23385b3fa01d-kube-api-access-mcq6p\") pod \"neutron-3eb9-account-create-update-ps2nd\" (UID: \"87689cb4-4d4f-431a-958e-23385b3fa01d\") " pod="openstack/neutron-3eb9-account-create-update-ps2nd" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.455380 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-operator-scripts\") pod \"neutron-db-create-lbhl4\" (UID: \"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb\") " pod="openstack/neutron-db-create-lbhl4" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.455409 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t22dz\" (UniqueName: \"kubernetes.io/projected/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-kube-api-access-t22dz\") pod \"neutron-db-create-lbhl4\" (UID: \"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb\") " pod="openstack/neutron-db-create-lbhl4" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.455481 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87689cb4-4d4f-431a-958e-23385b3fa01d-operator-scripts\") pod \"neutron-3eb9-account-create-update-ps2nd\" (UID: \"87689cb4-4d4f-431a-958e-23385b3fa01d\") " pod="openstack/neutron-3eb9-account-create-update-ps2nd" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.506035 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d9f898cc5-9k7vr"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.507623 
4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.519397 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d9f898cc5-9k7vr"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.556463 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-operator-scripts\") pod \"neutron-db-create-lbhl4\" (UID: \"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb\") " pod="openstack/neutron-db-create-lbhl4" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.556525 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t22dz\" (UniqueName: \"kubernetes.io/projected/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-kube-api-access-t22dz\") pod \"neutron-db-create-lbhl4\" (UID: \"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb\") " pod="openstack/neutron-db-create-lbhl4" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.556594 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87689cb4-4d4f-431a-958e-23385b3fa01d-operator-scripts\") pod \"neutron-3eb9-account-create-update-ps2nd\" (UID: \"87689cb4-4d4f-431a-958e-23385b3fa01d\") " pod="openstack/neutron-3eb9-account-create-update-ps2nd" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.556641 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcq6p\" (UniqueName: \"kubernetes.io/projected/87689cb4-4d4f-431a-958e-23385b3fa01d-kube-api-access-mcq6p\") pod \"neutron-3eb9-account-create-update-ps2nd\" (UID: \"87689cb4-4d4f-431a-958e-23385b3fa01d\") " pod="openstack/neutron-3eb9-account-create-update-ps2nd" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.557463 4697 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87689cb4-4d4f-431a-958e-23385b3fa01d-operator-scripts\") pod \"neutron-3eb9-account-create-update-ps2nd\" (UID: \"87689cb4-4d4f-431a-958e-23385b3fa01d\") " pod="openstack/neutron-3eb9-account-create-update-ps2nd" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.558009 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-operator-scripts\") pod \"neutron-db-create-lbhl4\" (UID: \"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb\") " pod="openstack/neutron-db-create-lbhl4" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.558795 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.582562 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22dz\" (UniqueName: \"kubernetes.io/projected/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-kube-api-access-t22dz\") pod \"neutron-db-create-lbhl4\" (UID: \"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb\") " pod="openstack/neutron-db-create-lbhl4" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.585829 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcq6p\" (UniqueName: \"kubernetes.io/projected/87689cb4-4d4f-431a-958e-23385b3fa01d-kube-api-access-mcq6p\") pod \"neutron-3eb9-account-create-update-ps2nd\" (UID: \"87689cb4-4d4f-431a-958e-23385b3fa01d\") " pod="openstack/neutron-3eb9-account-create-update-ps2nd" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.639074 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-lbhl4" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.657996 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-config\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.658062 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brhb2\" (UniqueName: \"kubernetes.io/projected/8dc1f573-c5c3-436b-9c10-820add978d5b-kube-api-access-brhb2\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.658086 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-svc\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.658155 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-swift-storage-0\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.658176 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-nb\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" 
(UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.658216 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-sb\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.716853 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-f7c6f"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.718195 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.732877 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-4pjll" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.751926 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.763275 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-config\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.763333 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brhb2\" (UniqueName: \"kubernetes.io/projected/8dc1f573-c5c3-436b-9c10-820add978d5b-kube-api-access-brhb2\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 
16:50:46.763365 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-svc\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.763424 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-swift-storage-0\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.763457 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-nb\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.763487 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-sb\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.764609 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-sb\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.765146 4697 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-svc\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.770384 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-swift-storage-0\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.774038 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-nb\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.774588 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-config\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.798862 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3eb9-account-create-update-ps2nd" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.804506 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brhb2\" (UniqueName: \"kubernetes.io/projected/8dc1f573-c5c3-436b-9c10-820add978d5b-kube-api-access-brhb2\") pod \"dnsmasq-dns-d9f898cc5-9k7vr\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.852627 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-f7c6f"] Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.869321 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-db-sync-config-data\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.869394 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v454r\" (UniqueName: \"kubernetes.io/projected/962571a7-edad-4e80-94b1-f8f1ba8621e4-kube-api-access-v454r\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.869667 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-config-data\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.869709 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-combined-ca-bundle\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.869931 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.971294 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v454r\" (UniqueName: \"kubernetes.io/projected/962571a7-edad-4e80-94b1-f8f1ba8621e4-kube-api-access-v454r\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.971397 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-config-data\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.971423 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-combined-ca-bundle\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.971526 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-db-sync-config-data\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.975710 
4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-db-sync-config-data\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.979193 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-config-data\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:46 crc kubenswrapper[4697]: I0220 16:50:46.999940 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-combined-ca-bundle\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.003903 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v454r\" (UniqueName: \"kubernetes.io/projected/962571a7-edad-4e80-94b1-f8f1ba8621e4-kube-api-access-v454r\") pod \"watcher-db-sync-f7c6f\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.073735 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hj5zq"] Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.086905 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.186457 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-554gk"] Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.384815 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e438-account-create-update-pwcsf"] Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.397678 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-fd85-account-create-update-c4m4q"] Feb 20 16:50:47 crc kubenswrapper[4697]: W0220 16:50:47.398201 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf5a353b_3fec_411e_8508_3b440af66824.slice/crio-c56cf50e70b5ef952b1e5284b8f22566cdaa499bb37e611b31c489ea63b13229 WatchSource:0}: Error finding container c56cf50e70b5ef952b1e5284b8f22566cdaa499bb37e611b31c489ea63b13229: Status 404 returned error can't find the container with id c56cf50e70b5ef952b1e5284b8f22566cdaa499bb37e611b31c489ea63b13229 Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.622608 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3eb9-account-create-update-ps2nd"] Feb 20 16:50:47 crc kubenswrapper[4697]: W0220 16:50:47.623886 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87689cb4_4d4f_431a_958e_23385b3fa01d.slice/crio-275f9384758f306e5527beac8936004daae140601e842cb7973707a814df0293 WatchSource:0}: Error finding container 275f9384758f306e5527beac8936004daae140601e842cb7973707a814df0293: Status 404 returned error can't find the container with id 275f9384758f306e5527beac8936004daae140601e842cb7973707a814df0293 Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.653509 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lbhl4"] Feb 20 
16:50:47 crc kubenswrapper[4697]: W0220 16:50:47.663380 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod085ec1b2_25fd_4df4_b5d2_3d9d04ade6bb.slice/crio-d89b14c72208ed5afa08c2290ddef4507199703e0eac9f07ab94e0e419d1f7fa WatchSource:0}: Error finding container d89b14c72208ed5afa08c2290ddef4507199703e0eac9f07ab94e0e419d1f7fa: Status 404 returned error can't find the container with id d89b14c72208ed5afa08c2290ddef4507199703e0eac9f07ab94e0e419d1f7fa Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.820938 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3eb9-account-create-update-ps2nd" event={"ID":"87689cb4-4d4f-431a-958e-23385b3fa01d","Type":"ContainerStarted","Data":"275f9384758f306e5527beac8936004daae140601e842cb7973707a814df0293"} Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.821955 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v6dg9"] Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.822018 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hj5zq" event={"ID":"c0ca09dc-ef65-4fec-8fea-eb06f96f3818","Type":"ContainerStarted","Data":"701fbb508a90afd52fab6ef8aacdcb92353cd3b1bea2e88c2e4b55dc45265dd6"} Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.822046 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hj5zq" event={"ID":"c0ca09dc-ef65-4fec-8fea-eb06f96f3818","Type":"ContainerStarted","Data":"30219b86571cd46df2ab7d6d96cd343b07cd816d76b4300641855c0688258293"} Feb 20 16:50:47 crc kubenswrapper[4697]: W0220 16:50:47.822658 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52cf1f6d_5da1_4b45_9dda_dbd21e972107.slice/crio-85b4ec61fe13bc52f3ed617b11b97d3a80e62da5b3d0f9fbaca9e13777c732f7 WatchSource:0}: Error finding container 
85b4ec61fe13bc52f3ed617b11b97d3a80e62da5b3d0f9fbaca9e13777c732f7: Status 404 returned error can't find the container with id 85b4ec61fe13bc52f3ed617b11b97d3a80e62da5b3d0f9fbaca9e13777c732f7 Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.823678 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e438-account-create-update-pwcsf" event={"ID":"6d2e30bf-2042-4e8d-b465-f1cfe98b2678","Type":"ContainerStarted","Data":"401a02c5a3493603acce6400b53996e036e2e84421253931a083e50d178add7d"} Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.823708 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e438-account-create-update-pwcsf" event={"ID":"6d2e30bf-2042-4e8d-b465-f1cfe98b2678","Type":"ContainerStarted","Data":"d1811a39284e5f5ec51581d69bfa05d29d1a7fa426a8cd568c865d507a06d3d4"} Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.830546 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fd85-account-create-update-c4m4q" event={"ID":"df5a353b-3fec-411e-8508-3b440af66824","Type":"ContainerStarted","Data":"31ebe6109cf2053b73b2151a184f8a4aab762a373433c298cd578bb25ba70f58"} Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.830585 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fd85-account-create-update-c4m4q" event={"ID":"df5a353b-3fec-411e-8508-3b440af66824","Type":"ContainerStarted","Data":"c56cf50e70b5ef952b1e5284b8f22566cdaa499bb37e611b31c489ea63b13229"} Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.832856 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lbhl4" event={"ID":"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb","Type":"ContainerStarted","Data":"d89b14c72208ed5afa08c2290ddef4507199703e0eac9f07ab94e0e419d1f7fa"} Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.842021 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-554gk" 
event={"ID":"785b68ff-7aef-47b3-85d8-587116e8ce3a","Type":"ContainerStarted","Data":"95a397c462e7f2e5fa2c00f58fbfe54275d815e830be489dda70c58d23cdba9a"} Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.842067 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-554gk" event={"ID":"785b68ff-7aef-47b3-85d8-587116e8ce3a","Type":"ContainerStarted","Data":"4234c400e3bedaf4261f76696756ce41e76ec27efee3bdbfc398513580578f84"} Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.853023 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d9f898cc5-9k7vr"] Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.853777 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-hj5zq" podStartSLOduration=2.853768057 podStartE2EDuration="2.853768057s" podCreationTimestamp="2026-02-20 16:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:47.844795926 +0000 UTC m=+1155.624841324" watchObservedRunningTime="2026-02-20 16:50:47.853768057 +0000 UTC m=+1155.633813465" Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.873370 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-554gk" podStartSLOduration=1.873354588 podStartE2EDuration="1.873354588s" podCreationTimestamp="2026-02-20 16:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:47.867536035 +0000 UTC m=+1155.647581443" watchObservedRunningTime="2026-02-20 16:50:47.873354588 +0000 UTC m=+1155.653399986" Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.901251 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e438-account-create-update-pwcsf" podStartSLOduration=1.901234632 
podStartE2EDuration="1.901234632s" podCreationTimestamp="2026-02-20 16:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:47.889663688 +0000 UTC m=+1155.669709096" watchObservedRunningTime="2026-02-20 16:50:47.901234632 +0000 UTC m=+1155.681280030" Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.901638 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-f7c6f"] Feb 20 16:50:47 crc kubenswrapper[4697]: I0220 16:50:47.935809 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-fd85-account-create-update-c4m4q" podStartSLOduration=2.935789251 podStartE2EDuration="2.935789251s" podCreationTimestamp="2026-02-20 16:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:47.91864466 +0000 UTC m=+1155.698690068" watchObservedRunningTime="2026-02-20 16:50:47.935789251 +0000 UTC m=+1155.715834659" Feb 20 16:50:48 crc kubenswrapper[4697]: E0220 16:50:48.614842 4697 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dc1f573_c5c3_436b_9c10_820add978d5b.slice/crio-conmon-c9656f2774e350da3d1bb1d9a56a8c63cef06abda935efd0fee0b1a6d0dd296a.scope\": RecentStats: unable to find data in memory cache]" Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.855905 4697 generic.go:334] "Generic (PLEG): container finished" podID="df5a353b-3fec-411e-8508-3b440af66824" containerID="31ebe6109cf2053b73b2151a184f8a4aab762a373433c298cd578bb25ba70f58" exitCode=0 Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.856144 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fd85-account-create-update-c4m4q" 
event={"ID":"df5a353b-3fec-411e-8508-3b440af66824","Type":"ContainerDied","Data":"31ebe6109cf2053b73b2151a184f8a4aab762a373433c298cd578bb25ba70f58"} Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.857958 4697 generic.go:334] "Generic (PLEG): container finished" podID="085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb" containerID="55f783900b9f281c401d65d9c7635229d628e5c1c43ff6a9627d4f15d3b15231" exitCode=0 Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.858030 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lbhl4" event={"ID":"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb","Type":"ContainerDied","Data":"55f783900b9f281c401d65d9c7635229d628e5c1c43ff6a9627d4f15d3b15231"} Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.860666 4697 generic.go:334] "Generic (PLEG): container finished" podID="785b68ff-7aef-47b3-85d8-587116e8ce3a" containerID="95a397c462e7f2e5fa2c00f58fbfe54275d815e830be489dda70c58d23cdba9a" exitCode=0 Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.860746 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-554gk" event={"ID":"785b68ff-7aef-47b3-85d8-587116e8ce3a","Type":"ContainerDied","Data":"95a397c462e7f2e5fa2c00f58fbfe54275d815e830be489dda70c58d23cdba9a"} Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.862744 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6dg9" event={"ID":"52cf1f6d-5da1-4b45-9dda-dbd21e972107","Type":"ContainerStarted","Data":"85b4ec61fe13bc52f3ed617b11b97d3a80e62da5b3d0f9fbaca9e13777c732f7"} Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.864317 4697 generic.go:334] "Generic (PLEG): container finished" podID="8dc1f573-c5c3-436b-9c10-820add978d5b" containerID="c9656f2774e350da3d1bb1d9a56a8c63cef06abda935efd0fee0b1a6d0dd296a" exitCode=0 Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.864385 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" event={"ID":"8dc1f573-c5c3-436b-9c10-820add978d5b","Type":"ContainerDied","Data":"c9656f2774e350da3d1bb1d9a56a8c63cef06abda935efd0fee0b1a6d0dd296a"} Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.864424 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" event={"ID":"8dc1f573-c5c3-436b-9c10-820add978d5b","Type":"ContainerStarted","Data":"2c172174b02dd5c372f5f17a2daae55b13462cb0a7b7e7c73170558ce3419c9e"} Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.866828 4697 generic.go:334] "Generic (PLEG): container finished" podID="6d2e30bf-2042-4e8d-b465-f1cfe98b2678" containerID="401a02c5a3493603acce6400b53996e036e2e84421253931a083e50d178add7d" exitCode=0 Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.866873 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e438-account-create-update-pwcsf" event={"ID":"6d2e30bf-2042-4e8d-b465-f1cfe98b2678","Type":"ContainerDied","Data":"401a02c5a3493603acce6400b53996e036e2e84421253931a083e50d178add7d"} Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.869006 4697 generic.go:334] "Generic (PLEG): container finished" podID="87689cb4-4d4f-431a-958e-23385b3fa01d" containerID="9816d60e5ac117fb51a276c86ccbe4935cbb8389ee92837b5e77762786a9b101" exitCode=0 Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.869044 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3eb9-account-create-update-ps2nd" event={"ID":"87689cb4-4d4f-431a-958e-23385b3fa01d","Type":"ContainerDied","Data":"9816d60e5ac117fb51a276c86ccbe4935cbb8389ee92837b5e77762786a9b101"} Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.875119 4697 generic.go:334] "Generic (PLEG): container finished" podID="c0ca09dc-ef65-4fec-8fea-eb06f96f3818" containerID="701fbb508a90afd52fab6ef8aacdcb92353cd3b1bea2e88c2e4b55dc45265dd6" exitCode=0 Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.875193 4697 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hj5zq" event={"ID":"c0ca09dc-ef65-4fec-8fea-eb06f96f3818","Type":"ContainerDied","Data":"701fbb508a90afd52fab6ef8aacdcb92353cd3b1bea2e88c2e4b55dc45265dd6"} Feb 20 16:50:48 crc kubenswrapper[4697]: I0220 16:50:48.908759 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f7c6f" event={"ID":"962571a7-edad-4e80-94b1-f8f1ba8621e4","Type":"ContainerStarted","Data":"9a2a47fa61e4683bab0bc85ab9bbeb18bed435a42defb8b7a1b0beabda0148ff"} Feb 20 16:50:49 crc kubenswrapper[4697]: I0220 16:50:49.903061 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" event={"ID":"8dc1f573-c5c3-436b-9c10-820add978d5b","Type":"ContainerStarted","Data":"20536d476bffd66b144be093eb27eb0aa93f06d37191f8f856c532b9ec985e0c"} Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.377129 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3eb9-account-create-update-ps2nd" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.421975 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" podStartSLOduration=4.421959092 podStartE2EDuration="4.421959092s" podCreationTimestamp="2026-02-20 16:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:50:49.932028781 +0000 UTC m=+1157.712074209" watchObservedRunningTime="2026-02-20 16:50:50.421959092 +0000 UTC m=+1158.202004500" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.439034 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcq6p\" (UniqueName: \"kubernetes.io/projected/87689cb4-4d4f-431a-958e-23385b3fa01d-kube-api-access-mcq6p\") pod \"87689cb4-4d4f-431a-958e-23385b3fa01d\" (UID: \"87689cb4-4d4f-431a-958e-23385b3fa01d\") " 
Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.439248 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87689cb4-4d4f-431a-958e-23385b3fa01d-operator-scripts\") pod \"87689cb4-4d4f-431a-958e-23385b3fa01d\" (UID: \"87689cb4-4d4f-431a-958e-23385b3fa01d\") " Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.440122 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87689cb4-4d4f-431a-958e-23385b3fa01d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87689cb4-4d4f-431a-958e-23385b3fa01d" (UID: "87689cb4-4d4f-431a-958e-23385b3fa01d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.448341 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87689cb4-4d4f-431a-958e-23385b3fa01d-kube-api-access-mcq6p" (OuterVolumeSpecName: "kube-api-access-mcq6p") pod "87689cb4-4d4f-431a-958e-23385b3fa01d" (UID: "87689cb4-4d4f-431a-958e-23385b3fa01d"). InnerVolumeSpecName "kube-api-access-mcq6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.542037 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcq6p\" (UniqueName: \"kubernetes.io/projected/87689cb4-4d4f-431a-958e-23385b3fa01d-kube-api-access-mcq6p\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.542212 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87689cb4-4d4f-431a-958e-23385b3fa01d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.609449 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hj5zq" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.619556 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fd85-account-create-update-c4m4q" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.680761 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lbhl4" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.688303 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-554gk" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.694701 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e438-account-create-update-pwcsf" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.780374 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t22dz\" (UniqueName: \"kubernetes.io/projected/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-kube-api-access-t22dz\") pod \"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb\" (UID: \"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb\") " Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.780691 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slkb9\" (UniqueName: \"kubernetes.io/projected/df5a353b-3fec-411e-8508-3b440af66824-kube-api-access-slkb9\") pod \"df5a353b-3fec-411e-8508-3b440af66824\" (UID: \"df5a353b-3fec-411e-8508-3b440af66824\") " Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.780788 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-operator-scripts\") pod \"c0ca09dc-ef65-4fec-8fea-eb06f96f3818\" (UID: \"c0ca09dc-ef65-4fec-8fea-eb06f96f3818\") " Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.780908 
4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68msz\" (UniqueName: \"kubernetes.io/projected/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-kube-api-access-68msz\") pod \"c0ca09dc-ef65-4fec-8fea-eb06f96f3818\" (UID: \"c0ca09dc-ef65-4fec-8fea-eb06f96f3818\") " Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.780973 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-operator-scripts\") pod \"6d2e30bf-2042-4e8d-b465-f1cfe98b2678\" (UID: \"6d2e30bf-2042-4e8d-b465-f1cfe98b2678\") " Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.781046 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/785b68ff-7aef-47b3-85d8-587116e8ce3a-operator-scripts\") pod \"785b68ff-7aef-47b3-85d8-587116e8ce3a\" (UID: \"785b68ff-7aef-47b3-85d8-587116e8ce3a\") " Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.781148 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5a353b-3fec-411e-8508-3b440af66824-operator-scripts\") pod \"df5a353b-3fec-411e-8508-3b440af66824\" (UID: \"df5a353b-3fec-411e-8508-3b440af66824\") " Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.781287 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5r49\" (UniqueName: \"kubernetes.io/projected/785b68ff-7aef-47b3-85d8-587116e8ce3a-kube-api-access-v5r49\") pod \"785b68ff-7aef-47b3-85d8-587116e8ce3a\" (UID: \"785b68ff-7aef-47b3-85d8-587116e8ce3a\") " Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.781379 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-operator-scripts\") pod \"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb\" (UID: \"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb\") " Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.781478 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dxwm\" (UniqueName: \"kubernetes.io/projected/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-kube-api-access-5dxwm\") pod \"6d2e30bf-2042-4e8d-b465-f1cfe98b2678\" (UID: \"6d2e30bf-2042-4e8d-b465-f1cfe98b2678\") " Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.786860 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-kube-api-access-5dxwm" (OuterVolumeSpecName: "kube-api-access-5dxwm") pod "6d2e30bf-2042-4e8d-b465-f1cfe98b2678" (UID: "6d2e30bf-2042-4e8d-b465-f1cfe98b2678"). InnerVolumeSpecName "kube-api-access-5dxwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.787981 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d2e30bf-2042-4e8d-b465-f1cfe98b2678" (UID: "6d2e30bf-2042-4e8d-b465-f1cfe98b2678"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.788991 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df5a353b-3fec-411e-8508-3b440af66824-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df5a353b-3fec-411e-8508-3b440af66824" (UID: "df5a353b-3fec-411e-8508-3b440af66824"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.789295 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0ca09dc-ef65-4fec-8fea-eb06f96f3818" (UID: "c0ca09dc-ef65-4fec-8fea-eb06f96f3818"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.789530 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-kube-api-access-t22dz" (OuterVolumeSpecName: "kube-api-access-t22dz") pod "085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb" (UID: "085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb"). InnerVolumeSpecName "kube-api-access-t22dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.789691 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb" (UID: "085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.790795 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/785b68ff-7aef-47b3-85d8-587116e8ce3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "785b68ff-7aef-47b3-85d8-587116e8ce3a" (UID: "785b68ff-7aef-47b3-85d8-587116e8ce3a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.793644 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-kube-api-access-68msz" (OuterVolumeSpecName: "kube-api-access-68msz") pod "c0ca09dc-ef65-4fec-8fea-eb06f96f3818" (UID: "c0ca09dc-ef65-4fec-8fea-eb06f96f3818"). InnerVolumeSpecName "kube-api-access-68msz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.793784 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/785b68ff-7aef-47b3-85d8-587116e8ce3a-kube-api-access-v5r49" (OuterVolumeSpecName: "kube-api-access-v5r49") pod "785b68ff-7aef-47b3-85d8-587116e8ce3a" (UID: "785b68ff-7aef-47b3-85d8-587116e8ce3a"). InnerVolumeSpecName "kube-api-access-v5r49". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.794658 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5a353b-3fec-411e-8508-3b440af66824-kube-api-access-slkb9" (OuterVolumeSpecName: "kube-api-access-slkb9") pod "df5a353b-3fec-411e-8508-3b440af66824" (UID: "df5a353b-3fec-411e-8508-3b440af66824"). InnerVolumeSpecName "kube-api-access-slkb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.883759 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dxwm\" (UniqueName: \"kubernetes.io/projected/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-kube-api-access-5dxwm\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.883787 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t22dz\" (UniqueName: \"kubernetes.io/projected/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-kube-api-access-t22dz\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.883798 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slkb9\" (UniqueName: \"kubernetes.io/projected/df5a353b-3fec-411e-8508-3b440af66824-kube-api-access-slkb9\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.883807 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.883815 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68msz\" (UniqueName: \"kubernetes.io/projected/c0ca09dc-ef65-4fec-8fea-eb06f96f3818-kube-api-access-68msz\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.883824 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2e30bf-2042-4e8d-b465-f1cfe98b2678-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.883832 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/785b68ff-7aef-47b3-85d8-587116e8ce3a-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.883842 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5a353b-3fec-411e-8508-3b440af66824-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.883850 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5r49\" (UniqueName: \"kubernetes.io/projected/785b68ff-7aef-47b3-85d8-587116e8ce3a-kube-api-access-v5r49\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.883857 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.914350 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fd85-account-create-update-c4m4q" event={"ID":"df5a353b-3fec-411e-8508-3b440af66824","Type":"ContainerDied","Data":"c56cf50e70b5ef952b1e5284b8f22566cdaa499bb37e611b31c489ea63b13229"} Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.914390 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56cf50e70b5ef952b1e5284b8f22566cdaa499bb37e611b31c489ea63b13229" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.914510 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fd85-account-create-update-c4m4q" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.917173 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-lbhl4" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.917166 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lbhl4" event={"ID":"085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb","Type":"ContainerDied","Data":"d89b14c72208ed5afa08c2290ddef4507199703e0eac9f07ab94e0e419d1f7fa"} Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.917279 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d89b14c72208ed5afa08c2290ddef4507199703e0eac9f07ab94e0e419d1f7fa" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.922026 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-554gk" event={"ID":"785b68ff-7aef-47b3-85d8-587116e8ce3a","Type":"ContainerDied","Data":"4234c400e3bedaf4261f76696756ce41e76ec27efee3bdbfc398513580578f84"} Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.922052 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4234c400e3bedaf4261f76696756ce41e76ec27efee3bdbfc398513580578f84" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.922053 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-554gk" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.924608 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3eb9-account-create-update-ps2nd" event={"ID":"87689cb4-4d4f-431a-958e-23385b3fa01d","Type":"ContainerDied","Data":"275f9384758f306e5527beac8936004daae140601e842cb7973707a814df0293"} Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.924627 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="275f9384758f306e5527beac8936004daae140601e842cb7973707a814df0293" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.924675 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3eb9-account-create-update-ps2nd" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.926497 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hj5zq" event={"ID":"c0ca09dc-ef65-4fec-8fea-eb06f96f3818","Type":"ContainerDied","Data":"30219b86571cd46df2ab7d6d96cd343b07cd816d76b4300641855c0688258293"} Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.926522 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30219b86571cd46df2ab7d6d96cd343b07cd816d76b4300641855c0688258293" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.926563 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hj5zq" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.936508 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e438-account-create-update-pwcsf" event={"ID":"6d2e30bf-2042-4e8d-b465-f1cfe98b2678","Type":"ContainerDied","Data":"d1811a39284e5f5ec51581d69bfa05d29d1a7fa426a8cd568c865d507a06d3d4"} Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.936550 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1811a39284e5f5ec51581d69bfa05d29d1a7fa426a8cd568c865d507a06d3d4" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.936524 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e438-account-create-update-pwcsf" Feb 20 16:50:50 crc kubenswrapper[4697]: I0220 16:50:50.936632 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:51 crc kubenswrapper[4697]: I0220 16:50:51.801706 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:51 crc kubenswrapper[4697]: I0220 16:50:51.806645 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:51 crc kubenswrapper[4697]: I0220 16:50:51.947650 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 20 16:50:56 crc kubenswrapper[4697]: I0220 16:50:56.872245 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:50:56 crc kubenswrapper[4697]: I0220 16:50:56.938459 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8f7465f5f-wx7p7"] Feb 20 16:50:56 crc kubenswrapper[4697]: I0220 16:50:56.938737 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" podUID="ed52949a-bfb6-4102-bd9f-9ea591aeb295" containerName="dnsmasq-dns" containerID="cri-o://43bb90b7cde45f9dd32ed8f6eeba7d2e8821c7c7244d353a87bf75f0802a6a2c" gracePeriod=10 Feb 20 16:50:58 crc kubenswrapper[4697]: I0220 16:50:58.033154 4697 generic.go:334] "Generic (PLEG): container finished" podID="ed52949a-bfb6-4102-bd9f-9ea591aeb295" containerID="43bb90b7cde45f9dd32ed8f6eeba7d2e8821c7c7244d353a87bf75f0802a6a2c" exitCode=0 Feb 20 16:50:58 crc kubenswrapper[4697]: I0220 16:50:58.033321 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" 
event={"ID":"ed52949a-bfb6-4102-bd9f-9ea591aeb295","Type":"ContainerDied","Data":"43bb90b7cde45f9dd32ed8f6eeba7d2e8821c7c7244d353a87bf75f0802a6a2c"} Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.394004 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.452277 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-config\") pod \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.452663 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-svc\") pod \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.452811 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-swift-storage-0\") pod \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.452922 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xtpb\" (UniqueName: \"kubernetes.io/projected/ed52949a-bfb6-4102-bd9f-9ea591aeb295-kube-api-access-9xtpb\") pod \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.453057 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-nb\") pod \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.453190 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-sb\") pod \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\" (UID: \"ed52949a-bfb6-4102-bd9f-9ea591aeb295\") " Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.457390 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed52949a-bfb6-4102-bd9f-9ea591aeb295-kube-api-access-9xtpb" (OuterVolumeSpecName: "kube-api-access-9xtpb") pod "ed52949a-bfb6-4102-bd9f-9ea591aeb295" (UID: "ed52949a-bfb6-4102-bd9f-9ea591aeb295"). InnerVolumeSpecName "kube-api-access-9xtpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.548148 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-config" (OuterVolumeSpecName: "config") pod "ed52949a-bfb6-4102-bd9f-9ea591aeb295" (UID: "ed52949a-bfb6-4102-bd9f-9ea591aeb295"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.548559 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed52949a-bfb6-4102-bd9f-9ea591aeb295" (UID: "ed52949a-bfb6-4102-bd9f-9ea591aeb295"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.552146 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed52949a-bfb6-4102-bd9f-9ea591aeb295" (UID: "ed52949a-bfb6-4102-bd9f-9ea591aeb295"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.552165 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed52949a-bfb6-4102-bd9f-9ea591aeb295" (UID: "ed52949a-bfb6-4102-bd9f-9ea591aeb295"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.555055 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.555325 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.555334 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xtpb\" (UniqueName: \"kubernetes.io/projected/ed52949a-bfb6-4102-bd9f-9ea591aeb295-kube-api-access-9xtpb\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.555348 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 
20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.555356 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.557599 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ed52949a-bfb6-4102-bd9f-9ea591aeb295" (UID: "ed52949a-bfb6-4102-bd9f-9ea591aeb295"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:00 crc kubenswrapper[4697]: I0220 16:51:00.658343 4697 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed52949a-bfb6-4102-bd9f-9ea591aeb295-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:01 crc kubenswrapper[4697]: I0220 16:51:01.072119 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6dg9" event={"ID":"52cf1f6d-5da1-4b45-9dda-dbd21e972107","Type":"ContainerStarted","Data":"49c1bdfa25b7858554685eee5a89282cf25449f7f201400f665d8733e5795731"} Feb 20 16:51:01 crc kubenswrapper[4697]: I0220 16:51:01.076112 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" event={"ID":"ed52949a-bfb6-4102-bd9f-9ea591aeb295","Type":"ContainerDied","Data":"7c17e45fb4317b3a9f37e3c07b68782043312bc2ccf550ffc9c9b2316c3bfa84"} Feb 20 16:51:01 crc kubenswrapper[4697]: I0220 16:51:01.076132 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8f7465f5f-wx7p7" Feb 20 16:51:01 crc kubenswrapper[4697]: I0220 16:51:01.076337 4697 scope.go:117] "RemoveContainer" containerID="43bb90b7cde45f9dd32ed8f6eeba7d2e8821c7c7244d353a87bf75f0802a6a2c" Feb 20 16:51:01 crc kubenswrapper[4697]: I0220 16:51:01.077459 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f7c6f" event={"ID":"962571a7-edad-4e80-94b1-f8f1ba8621e4","Type":"ContainerStarted","Data":"501af9233f29b879b25f5ae5f15e15ebd10c2fc86ee1070a21ad8753d2d078f8"} Feb 20 16:51:01 crc kubenswrapper[4697]: I0220 16:51:01.100983 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-v6dg9" podStartSLOduration=2.622289258 podStartE2EDuration="15.10096459s" podCreationTimestamp="2026-02-20 16:50:46 +0000 UTC" firstStartedPulling="2026-02-20 16:50:47.824957659 +0000 UTC m=+1155.605003067" lastFinishedPulling="2026-02-20 16:51:00.303632991 +0000 UTC m=+1168.083678399" observedRunningTime="2026-02-20 16:51:01.094163063 +0000 UTC m=+1168.874208471" watchObservedRunningTime="2026-02-20 16:51:01.10096459 +0000 UTC m=+1168.881009998" Feb 20 16:51:01 crc kubenswrapper[4697]: I0220 16:51:01.106799 4697 scope.go:117] "RemoveContainer" containerID="5d59703e58aed3ea649faa9d298e8b65da7de8f6f3df345447fa59fa396a07df" Feb 20 16:51:01 crc kubenswrapper[4697]: I0220 16:51:01.110790 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8f7465f5f-wx7p7"] Feb 20 16:51:01 crc kubenswrapper[4697]: I0220 16:51:01.117582 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8f7465f5f-wx7p7"] Feb 20 16:51:01 crc kubenswrapper[4697]: I0220 16:51:01.130102 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-f7c6f" podStartSLOduration=2.7142314770000002 podStartE2EDuration="15.130082275s" podCreationTimestamp="2026-02-20 16:50:46 +0000 UTC" 
firstStartedPulling="2026-02-20 16:50:47.974803349 +0000 UTC m=+1155.754848757" lastFinishedPulling="2026-02-20 16:51:00.390654137 +0000 UTC m=+1168.170699555" observedRunningTime="2026-02-20 16:51:01.121331731 +0000 UTC m=+1168.901377149" watchObservedRunningTime="2026-02-20 16:51:01.130082275 +0000 UTC m=+1168.910127703" Feb 20 16:51:02 crc kubenswrapper[4697]: I0220 16:51:02.888087 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed52949a-bfb6-4102-bd9f-9ea591aeb295" path="/var/lib/kubelet/pods/ed52949a-bfb6-4102-bd9f-9ea591aeb295/volumes" Feb 20 16:51:04 crc kubenswrapper[4697]: I0220 16:51:04.103311 4697 generic.go:334] "Generic (PLEG): container finished" podID="962571a7-edad-4e80-94b1-f8f1ba8621e4" containerID="501af9233f29b879b25f5ae5f15e15ebd10c2fc86ee1070a21ad8753d2d078f8" exitCode=0 Feb 20 16:51:04 crc kubenswrapper[4697]: I0220 16:51:04.103381 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f7c6f" event={"ID":"962571a7-edad-4e80-94b1-f8f1ba8621e4","Type":"ContainerDied","Data":"501af9233f29b879b25f5ae5f15e15ebd10c2fc86ee1070a21ad8753d2d078f8"} Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.119038 4697 generic.go:334] "Generic (PLEG): container finished" podID="52cf1f6d-5da1-4b45-9dda-dbd21e972107" containerID="49c1bdfa25b7858554685eee5a89282cf25449f7f201400f665d8733e5795731" exitCode=0 Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.119235 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6dg9" event={"ID":"52cf1f6d-5da1-4b45-9dda-dbd21e972107","Type":"ContainerDied","Data":"49c1bdfa25b7858554685eee5a89282cf25449f7f201400f665d8733e5795731"} Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.551054 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.751791 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-db-sync-config-data\") pod \"962571a7-edad-4e80-94b1-f8f1ba8621e4\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.751932 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-config-data\") pod \"962571a7-edad-4e80-94b1-f8f1ba8621e4\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.752035 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v454r\" (UniqueName: \"kubernetes.io/projected/962571a7-edad-4e80-94b1-f8f1ba8621e4-kube-api-access-v454r\") pod \"962571a7-edad-4e80-94b1-f8f1ba8621e4\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.752083 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-combined-ca-bundle\") pod \"962571a7-edad-4e80-94b1-f8f1ba8621e4\" (UID: \"962571a7-edad-4e80-94b1-f8f1ba8621e4\") " Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.756804 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962571a7-edad-4e80-94b1-f8f1ba8621e4-kube-api-access-v454r" (OuterVolumeSpecName: "kube-api-access-v454r") pod "962571a7-edad-4e80-94b1-f8f1ba8621e4" (UID: "962571a7-edad-4e80-94b1-f8f1ba8621e4"). InnerVolumeSpecName "kube-api-access-v454r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.757291 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "962571a7-edad-4e80-94b1-f8f1ba8621e4" (UID: "962571a7-edad-4e80-94b1-f8f1ba8621e4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.795739 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "962571a7-edad-4e80-94b1-f8f1ba8621e4" (UID: "962571a7-edad-4e80-94b1-f8f1ba8621e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.799574 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-config-data" (OuterVolumeSpecName: "config-data") pod "962571a7-edad-4e80-94b1-f8f1ba8621e4" (UID: "962571a7-edad-4e80-94b1-f8f1ba8621e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.853805 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.853839 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v454r\" (UniqueName: \"kubernetes.io/projected/962571a7-edad-4e80-94b1-f8f1ba8621e4-kube-api-access-v454r\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.853853 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:05 crc kubenswrapper[4697]: I0220 16:51:05.853865 4697 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/962571a7-edad-4e80-94b1-f8f1ba8621e4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.128304 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f7c6f" event={"ID":"962571a7-edad-4e80-94b1-f8f1ba8621e4","Type":"ContainerDied","Data":"9a2a47fa61e4683bab0bc85ab9bbeb18bed435a42defb8b7a1b0beabda0148ff"} Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.128637 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a2a47fa61e4683bab0bc85ab9bbeb18bed435a42defb8b7a1b0beabda0148ff" Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.128334 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-f7c6f" Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.471673 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.491960 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk44b\" (UniqueName: \"kubernetes.io/projected/52cf1f6d-5da1-4b45-9dda-dbd21e972107-kube-api-access-dk44b\") pod \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.492002 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-config-data\") pod \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.492063 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-combined-ca-bundle\") pod \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\" (UID: \"52cf1f6d-5da1-4b45-9dda-dbd21e972107\") " Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.500335 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52cf1f6d-5da1-4b45-9dda-dbd21e972107-kube-api-access-dk44b" (OuterVolumeSpecName: "kube-api-access-dk44b") pod "52cf1f6d-5da1-4b45-9dda-dbd21e972107" (UID: "52cf1f6d-5da1-4b45-9dda-dbd21e972107"). InnerVolumeSpecName "kube-api-access-dk44b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.520056 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52cf1f6d-5da1-4b45-9dda-dbd21e972107" (UID: "52cf1f6d-5da1-4b45-9dda-dbd21e972107"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.533916 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-config-data" (OuterVolumeSpecName: "config-data") pod "52cf1f6d-5da1-4b45-9dda-dbd21e972107" (UID: "52cf1f6d-5da1-4b45-9dda-dbd21e972107"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.593419 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk44b\" (UniqueName: \"kubernetes.io/projected/52cf1f6d-5da1-4b45-9dda-dbd21e972107-kube-api-access-dk44b\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.593459 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:06 crc kubenswrapper[4697]: I0220 16:51:06.593470 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cf1f6d-5da1-4b45-9dda-dbd21e972107-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.142983 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6dg9" event={"ID":"52cf1f6d-5da1-4b45-9dda-dbd21e972107","Type":"ContainerDied","Data":"85b4ec61fe13bc52f3ed617b11b97d3a80e62da5b3d0f9fbaca9e13777c732f7"} Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.143045 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85b4ec61fe13bc52f3ed617b11b97d3a80e62da5b3d0f9fbaca9e13777c732f7" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.143052 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v6dg9" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.413463 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f86f8bdf5-h78wg"] Feb 20 16:51:07 crc kubenswrapper[4697]: E0220 16:51:07.413775 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ca09dc-ef65-4fec-8fea-eb06f96f3818" containerName="mariadb-database-create" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.413791 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ca09dc-ef65-4fec-8fea-eb06f96f3818" containerName="mariadb-database-create" Feb 20 16:51:07 crc kubenswrapper[4697]: E0220 16:51:07.413804 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5a353b-3fec-411e-8508-3b440af66824" containerName="mariadb-account-create-update" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.413811 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5a353b-3fec-411e-8508-3b440af66824" containerName="mariadb-account-create-update" Feb 20 16:51:07 crc kubenswrapper[4697]: E0220 16:51:07.413822 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2e30bf-2042-4e8d-b465-f1cfe98b2678" containerName="mariadb-account-create-update" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.413828 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2e30bf-2042-4e8d-b465-f1cfe98b2678" containerName="mariadb-account-create-update" Feb 20 16:51:07 crc kubenswrapper[4697]: E0220 16:51:07.413843 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962571a7-edad-4e80-94b1-f8f1ba8621e4" containerName="watcher-db-sync" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.413848 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="962571a7-edad-4e80-94b1-f8f1ba8621e4" containerName="watcher-db-sync" Feb 20 16:51:07 crc kubenswrapper[4697]: E0220 16:51:07.413859 4697 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="785b68ff-7aef-47b3-85d8-587116e8ce3a" containerName="mariadb-database-create" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.413865 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="785b68ff-7aef-47b3-85d8-587116e8ce3a" containerName="mariadb-database-create" Feb 20 16:51:07 crc kubenswrapper[4697]: E0220 16:51:07.413877 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed52949a-bfb6-4102-bd9f-9ea591aeb295" containerName="init" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.413884 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed52949a-bfb6-4102-bd9f-9ea591aeb295" containerName="init" Feb 20 16:51:07 crc kubenswrapper[4697]: E0220 16:51:07.413892 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed52949a-bfb6-4102-bd9f-9ea591aeb295" containerName="dnsmasq-dns" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.413898 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed52949a-bfb6-4102-bd9f-9ea591aeb295" containerName="dnsmasq-dns" Feb 20 16:51:07 crc kubenswrapper[4697]: E0220 16:51:07.413910 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87689cb4-4d4f-431a-958e-23385b3fa01d" containerName="mariadb-account-create-update" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.413915 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="87689cb4-4d4f-431a-958e-23385b3fa01d" containerName="mariadb-account-create-update" Feb 20 16:51:07 crc kubenswrapper[4697]: E0220 16:51:07.413927 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb" containerName="mariadb-database-create" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.413933 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb" containerName="mariadb-database-create" Feb 20 16:51:07 crc kubenswrapper[4697]: E0220 16:51:07.413941 4697 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="52cf1f6d-5da1-4b45-9dda-dbd21e972107" containerName="keystone-db-sync" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.413946 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cf1f6d-5da1-4b45-9dda-dbd21e972107" containerName="keystone-db-sync" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.414118 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="52cf1f6d-5da1-4b45-9dda-dbd21e972107" containerName="keystone-db-sync" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.414138 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed52949a-bfb6-4102-bd9f-9ea591aeb295" containerName="dnsmasq-dns" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.414161 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="785b68ff-7aef-47b3-85d8-587116e8ce3a" containerName="mariadb-database-create" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.414181 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2e30bf-2042-4e8d-b465-f1cfe98b2678" containerName="mariadb-account-create-update" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.414198 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="962571a7-edad-4e80-94b1-f8f1ba8621e4" containerName="watcher-db-sync" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.414215 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb" containerName="mariadb-database-create" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.414227 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5a353b-3fec-411e-8508-3b440af66824" containerName="mariadb-account-create-update" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.414247 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="87689cb4-4d4f-431a-958e-23385b3fa01d" containerName="mariadb-account-create-update" Feb 20 
16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.414266 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0ca09dc-ef65-4fec-8fea-eb06f96f3818" containerName="mariadb-database-create"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.415246 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.430297 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f86f8bdf5-h78wg"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.501674 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-c7pf4"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.506039 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.512240 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.512469 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nd79"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.512588 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.512774 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.512939 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.515812 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-swift-storage-0\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.515857 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-sb\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.515901 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-svc\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.516030 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-config\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.516054 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz6g9\" (UniqueName: \"kubernetes.io/projected/c3a092e2-094c-41a2-add8-41c02dbeb9db-kube-api-access-bz6g9\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.516087 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-nb\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.527647 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c7pf4"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.600070 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.601287 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.608638 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.609484 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-4pjll"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617496 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-config-data\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617553 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-config\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617574 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz6g9\" (UniqueName: \"kubernetes.io/projected/c3a092e2-094c-41a2-add8-41c02dbeb9db-kube-api-access-bz6g9\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617603 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617619 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-config-data\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617638 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-nb\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617663 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-scripts\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617679 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8abdcc99-bf45-4ca2-82b4-147b0a707333-logs\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617723 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-fernet-keys\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617740 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-swift-storage-0\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617759 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-sb\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617783 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nqkv\" (UniqueName: \"kubernetes.io/projected/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-kube-api-access-5nqkv\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617804 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-svc\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617835 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-combined-ca-bundle\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617877 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-credential-keys\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.617900 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbbtp\" (UniqueName: \"kubernetes.io/projected/8abdcc99-bf45-4ca2-82b4-147b0a707333-kube-api-access-cbbtp\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.618885 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-config\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.619004 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-nb\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.619151 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-swift-storage-0\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.619547 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-svc\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.619570 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-sb\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.628496 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.629641 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.635899 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.655284 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz6g9\" (UniqueName: \"kubernetes.io/projected/c3a092e2-094c-41a2-add8-41c02dbeb9db-kube-api-access-bz6g9\") pod \"dnsmasq-dns-5f86f8bdf5-h78wg\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.666200 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.667649 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.673412 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.712531 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719521 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719566 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-fernet-keys\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719600 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nqkv\" (UniqueName: \"kubernetes.io/projected/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-kube-api-access-5nqkv\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719622 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-config-data\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719638 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbxhq\" (UniqueName: \"kubernetes.io/projected/aea295ce-e05d-46a9-9a39-94ffc4b29826-kube-api-access-tbxhq\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719662 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-logs\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719680 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-combined-ca-bundle\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719720 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719737 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-credential-keys\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719757 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbbtp\" (UniqueName: \"kubernetes.io/projected/8abdcc99-bf45-4ca2-82b4-147b0a707333-kube-api-access-cbbtp\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719799 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-config-data\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719834 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719856 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719876 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-config-data\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719897 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea295ce-e05d-46a9-9a39-94ffc4b29826-logs\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719923 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-scripts\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719939 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8abdcc99-bf45-4ca2-82b4-147b0a707333-logs\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719963 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7mp4\" (UniqueName: \"kubernetes.io/projected/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-kube-api-access-w7mp4\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.719984 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-config-data\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.720003 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.730604 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.732519 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8abdcc99-bf45-4ca2-82b4-147b0a707333-logs\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.740159 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.753577 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.753774 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.755639 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-config-data\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.763058 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-scripts\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.763514 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-fernet-keys\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.774063 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbbtp\" (UniqueName: \"kubernetes.io/projected/8abdcc99-bf45-4ca2-82b4-147b0a707333-kube-api-access-cbbtp\") pod \"watcher-applier-0\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.781046 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nqkv\" (UniqueName: \"kubernetes.io/projected/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-kube-api-access-5nqkv\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.790961 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-combined-ca-bundle\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.793448 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-config-data\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.799551 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-95ff577cc-sl4rs"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.800983 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-95ff577cc-sl4rs"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.803101 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-credential-keys\") pod \"keystone-bootstrap-c7pf4\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.812127 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-95ff577cc-sl4rs"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.818853 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.819243 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.819715 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-2jt97"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.819938 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.821308 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-logs\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.821409 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dfa5c6d4-0e61-4a76-8154-275204545c7b-horizon-secret-key\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.821606 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.821690 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-scripts\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.821781 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdmmd\" (UniqueName: \"kubernetes.io/projected/dfa5c6d4-0e61-4a76-8154-275204545c7b-kube-api-access-fdmmd\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.821848 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.821967 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea295ce-e05d-46a9-9a39-94ffc4b29826-logs\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.822057 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfa5c6d4-0e61-4a76-8154-275204545c7b-logs\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.822147 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7mp4\" (UniqueName: \"kubernetes.io/projected/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-kube-api-access-w7mp4\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.825671 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-logs\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.831349 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.831657 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea295ce-e05d-46a9-9a39-94ffc4b29826-logs\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.832185 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c7pf4"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.832713 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.837449 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-config-data\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.837551 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.837623 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-config-data\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.837713 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.837815 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-config-data\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.837877 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbxhq\" (UniqueName: \"kubernetes.io/projected/aea295ce-e05d-46a9-9a39-94ffc4b29826-kube-api-access-tbxhq\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.843475 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-config-data\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.858361 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.868788 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7mp4\" (UniqueName: \"kubernetes.io/projected/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-kube-api-access-w7mp4\") pod \"watcher-api-0\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.874983 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-config-data\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.875546 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.881092 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbxhq\" (UniqueName: \"kubernetes.io/projected/aea295ce-e05d-46a9-9a39-94ffc4b29826-kube-api-access-tbxhq\") pod \"watcher-decision-engine-0\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.881170 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-k8pk7"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.882259 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-k8pk7"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.888347 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.888515 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zwpk7"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.888580 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.905030 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-k8pk7"]
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.905244 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.920513 4697 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/watcher-applier-0" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.939815 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-scripts\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.939879 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdmmd\" (UniqueName: \"kubernetes.io/projected/dfa5c6d4-0e61-4a76-8154-275204545c7b-kube-api-access-fdmmd\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.939928 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxd2h\" (UniqueName: \"kubernetes.io/projected/5959da8c-890a-4acb-9781-71b7c9fb33a5-kube-api-access-sxd2h\") pod \"neutron-db-sync-k8pk7\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.939986 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfa5c6d4-0e61-4a76-8154-275204545c7b-logs\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.940055 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-config-data\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:07 crc kubenswrapper[4697]: 
I0220 16:51:07.940122 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-combined-ca-bundle\") pod \"neutron-db-sync-k8pk7\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.940177 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dfa5c6d4-0e61-4a76-8154-275204545c7b-horizon-secret-key\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.940250 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-config\") pod \"neutron-db-sync-k8pk7\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.941700 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-scripts\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.943646 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfa5c6d4-0e61-4a76-8154-275204545c7b-logs\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.945965 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-config-data\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.964623 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dfa5c6d4-0e61-4a76-8154-275204545c7b-horizon-secret-key\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.970808 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdmmd\" (UniqueName: \"kubernetes.io/projected/dfa5c6d4-0e61-4a76-8154-275204545c7b-kube-api-access-fdmmd\") pod \"horizon-95ff577cc-sl4rs\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.997074 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 20 16:51:07 crc kubenswrapper[4697]: I0220 16:51:07.993773 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-dzcml"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.047928 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxd2h\" (UniqueName: \"kubernetes.io/projected/5959da8c-890a-4acb-9781-71b7c9fb33a5-kube-api-access-sxd2h\") pod \"neutron-db-sync-k8pk7\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.048926 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-combined-ca-bundle\") pod \"neutron-db-sync-k8pk7\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.050038 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-config\") pod \"neutron-db-sync-k8pk7\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.051524 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.079326 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.097161 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.102191 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m2cjp" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.102720 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.103044 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.103629 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-config\") pod \"neutron-db-sync-k8pk7\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.109144 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.109541 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.128334 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-combined-ca-bundle\") pod \"neutron-db-sync-k8pk7\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.128967 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxd2h\" (UniqueName: \"kubernetes.io/projected/5959da8c-890a-4acb-9781-71b7c9fb33a5-kube-api-access-sxd2h\") pod \"neutron-db-sync-k8pk7\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:08 
crc kubenswrapper[4697]: I0220 16:51:08.159570 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dzcml"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.184827 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.203547 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-hjgnj"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.204828 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hjgnj" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.205505 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hjgnj"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.212981 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-lv84x"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.214330 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.224600 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.224740 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-k2bh7" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.225062 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.225311 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2kwkr" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.225473 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.228697 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.231565 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f86f8bdf5-h78wg"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.252027 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lv84x"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.252459 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.258111 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-combined-ca-bundle\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.258991 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-scripts\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.259016 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bc8e40c-20d6-41e9-9f4e-25112a77e115-etc-machine-id\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.259049 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g885n\" (UniqueName: \"kubernetes.io/projected/4bc8e40c-20d6-41e9-9f4e-25112a77e115-kube-api-access-g885n\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.259067 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-scripts\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 
16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.259084 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drptr\" (UniqueName: \"kubernetes.io/projected/b988e030-0e85-401e-bd22-057f9c2de43d-kube-api-access-drptr\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.259106 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-config-data\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.259123 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.259143 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-config-data\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.259163 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-db-sync-config-data\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.259205 4697 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-log-httpd\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.259233 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-run-httpd\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.259290 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.273625 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dc676dfcf-zstfx"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.276818 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.303524 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-ffc9c47b9-vz9sg"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.306391 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.318036 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc676dfcf-zstfx"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.324578 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-ffc9c47b9-vz9sg"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.349262 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.351890 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.353470 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qzcv5" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.357941 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.358113 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.358509 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360584 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360632 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-scripts\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360654 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-config-data\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360671 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-combined-ca-bundle\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360693 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-scripts\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360714 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bc8e40c-20d6-41e9-9f4e-25112a77e115-etc-machine-id\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360732 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-combined-ca-bundle\") pod 
\"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360756 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360781 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g885n\" (UniqueName: \"kubernetes.io/projected/4bc8e40c-20d6-41e9-9f4e-25112a77e115-kube-api-access-g885n\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360799 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-scripts\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360817 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drptr\" (UniqueName: \"kubernetes.io/projected/b988e030-0e85-401e-bd22-057f9c2de43d-kube-api-access-drptr\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360836 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b22hw\" (UniqueName: \"kubernetes.io/projected/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-kube-api-access-b22hw\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " 
pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360858 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-config-data\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360875 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360893 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbgs2\" (UniqueName: \"kubernetes.io/projected/44731724-9f5b-4193-82c9-4233d91bac74-kube-api-access-tbgs2\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360913 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-config\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360933 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-config-data\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360953 4697 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-db-sync-config-data\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360979 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5f9j\" (UniqueName: \"kubernetes.io/projected/39238d29-1c84-4169-b011-455bd2e7f000-kube-api-access-v5f9j\") pod \"barbican-db-sync-hjgnj\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") " pod="openstack/barbican-db-sync-hjgnj" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.360996 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-combined-ca-bundle\") pod \"barbican-db-sync-hjgnj\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") " pod="openstack/barbican-db-sync-hjgnj" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.361019 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-db-sync-config-data\") pod \"barbican-db-sync-hjgnj\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") " pod="openstack/barbican-db-sync-hjgnj" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.361065 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-svc\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.361081 4697 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-swift-storage-0\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.361100 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-log-httpd\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.361126 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-run-httpd\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.361156 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-logs\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.361177 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.362379 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/4bc8e40c-20d6-41e9-9f4e-25112a77e115-etc-machine-id\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.366466 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-log-httpd\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.366645 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-run-httpd\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.395884 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.406032 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g885n\" (UniqueName: \"kubernetes.io/projected/4bc8e40c-20d6-41e9-9f4e-25112a77e115-kube-api-access-g885n\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.406545 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-scripts\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.409024 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-db-sync-config-data\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.420976 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-combined-ca-bundle\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.421201 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-scripts\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.421717 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-config-data\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.423893 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drptr\" (UniqueName: \"kubernetes.io/projected/b988e030-0e85-401e-bd22-057f9c2de43d-kube-api-access-drptr\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.429304 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: 
I0220 16:51:08.430005 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.431522 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.433951 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.434345 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.438704 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.441207 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-config-data\") pod \"cinder-db-sync-dzcml\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.449355 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462221 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-combined-ca-bundle\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462266 4697 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462303 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b22hw\" (UniqueName: \"kubernetes.io/projected/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-kube-api-access-b22hw\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462333 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462351 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbgs2\" (UniqueName: \"kubernetes.io/projected/44731724-9f5b-4193-82c9-4233d91bac74-kube-api-access-tbgs2\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462369 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-config\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462389 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wghq\" (UniqueName: \"kubernetes.io/projected/6f7e5f6d-970f-4773-8a37-bb831984fc44-kube-api-access-4wghq\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462410 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462449 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5f9j\" (UniqueName: \"kubernetes.io/projected/39238d29-1c84-4169-b011-455bd2e7f000-kube-api-access-v5f9j\") pod \"barbican-db-sync-hjgnj\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") " pod="openstack/barbican-db-sync-hjgnj" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462464 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-config-data\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462482 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-combined-ca-bundle\") pod \"barbican-db-sync-hjgnj\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") " pod="openstack/barbican-db-sync-hjgnj" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462498 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-scripts\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462542 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462563 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-db-sync-config-data\") pod \"barbican-db-sync-hjgnj\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") " pod="openstack/barbican-db-sync-hjgnj" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462578 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-swift-storage-0\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462593 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-svc\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462620 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-logs\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462640 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462668 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462692 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1083350-cbcc-4cf4-a927-a1b63029ffbe-logs\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462719 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-logs\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462747 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462771 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462804 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn4q4\" (UniqueName: \"kubernetes.io/projected/a1083350-cbcc-4cf4-a927-a1b63029ffbe-kube-api-access-vn4q4\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462822 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-scripts\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462841 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-config-data\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.462865 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/a1083350-cbcc-4cf4-a927-a1b63029ffbe-horizon-secret-key\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.463673 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.463784 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-config\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.464395 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-logs\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.464914 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.465402 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-svc\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: 
\"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.466003 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-swift-storage-0\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.474195 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-combined-ca-bundle\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.474208 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-combined-ca-bundle\") pod \"barbican-db-sync-hjgnj\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") " pod="openstack/barbican-db-sync-hjgnj" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.476891 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-db-sync-config-data\") pod \"barbican-db-sync-hjgnj\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") " pod="openstack/barbican-db-sync-hjgnj" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.480401 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-config-data\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc 
kubenswrapper[4697]: I0220 16:51:08.481076 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-scripts\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.488681 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5f9j\" (UniqueName: \"kubernetes.io/projected/39238d29-1c84-4169-b011-455bd2e7f000-kube-api-access-v5f9j\") pod \"barbican-db-sync-hjgnj\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") " pod="openstack/barbican-db-sync-hjgnj" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.488951 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.496327 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbgs2\" (UniqueName: \"kubernetes.io/projected/44731724-9f5b-4193-82c9-4233d91bac74-kube-api-access-tbgs2\") pod \"dnsmasq-dns-6dc676dfcf-zstfx\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.505941 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b22hw\" (UniqueName: \"kubernetes.io/projected/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-kube-api-access-b22hw\") pod \"placement-db-sync-lv84x\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.552692 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hjgnj" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.563783 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.563845 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.563873 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wghq\" (UniqueName: \"kubernetes.io/projected/6f7e5f6d-970f-4773-8a37-bb831984fc44-kube-api-access-4wghq\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.563915 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.563948 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.563964 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.563992 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-config-data\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564010 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-scripts\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564032 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564048 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 
16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564067 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564090 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-logs\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564109 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564131 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564149 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564167 
4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1083350-cbcc-4cf4-a927-a1b63029ffbe-logs\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564183 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-457qx\" (UniqueName: \"kubernetes.io/projected/62c15b1d-702d-4d85-93d3-c949ae8e421e-kube-api-access-457qx\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564210 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-logs\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564495 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564556 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn4q4\" (UniqueName: \"kubernetes.io/projected/a1083350-cbcc-4cf4-a927-a1b63029ffbe-kube-api-access-vn4q4\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.564585 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1083350-cbcc-4cf4-a927-a1b63029ffbe-horizon-secret-key\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.565687 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.566110 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.567030 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-config-data\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.567488 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-logs\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.572319 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.573084 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-scripts\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.573881 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.575724 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1083350-cbcc-4cf4-a927-a1b63029ffbe-logs\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.578636 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1083350-cbcc-4cf4-a927-a1b63029ffbe-horizon-secret-key\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.586607 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " 
pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.586959 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wghq\" (UniqueName: \"kubernetes.io/projected/6f7e5f6d-970f-4773-8a37-bb831984fc44-kube-api-access-4wghq\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.586999 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.596050 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn4q4\" (UniqueName: \"kubernetes.io/projected/a1083350-cbcc-4cf4-a927-a1b63029ffbe-kube-api-access-vn4q4\") pod \"horizon-ffc9c47b9-vz9sg\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.607710 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.608392 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.634006 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.665986 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.666024 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.666079 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.666100 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.666159 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc 
kubenswrapper[4697]: I0220 16:51:08.666182 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-457qx\" (UniqueName: \"kubernetes.io/projected/62c15b1d-702d-4d85-93d3-c949ae8e421e-kube-api-access-457qx\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.666208 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-logs\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.666311 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.671943 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.675056 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.676732 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.677263 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.680014 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.685693 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-logs\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.698805 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.717895 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-457qx\" (UniqueName: 
\"kubernetes.io/projected/62c15b1d-702d-4d85-93d3-c949ae8e421e-kube-api-access-457qx\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.739262 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dzcml" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.775997 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.814880 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.839042 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 16:51:08 crc kubenswrapper[4697]: I0220 16:51:08.865014 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 16:51:09 crc kubenswrapper[4697]: I0220 16:51:09.035451 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f86f8bdf5-h78wg"] Feb 20 16:51:09 crc kubenswrapper[4697]: I0220 16:51:09.035486 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:51:09 crc kubenswrapper[4697]: I0220 16:51:09.453507 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c7pf4"] Feb 20 16:51:09 crc kubenswrapper[4697]: I0220 16:51:09.880291 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 20 16:51:09 crc kubenswrapper[4697]: W0220 16:51:09.892992 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8abdcc99_bf45_4ca2_82b4_147b0a707333.slice/crio-fe3228b2d1a472c18ab9b7a1c7805bb9948e2c1adfc9ec8e28e73d9ed7066b68 WatchSource:0}: Error finding container fe3228b2d1a472c18ab9b7a1c7805bb9948e2c1adfc9ec8e28e73d9ed7066b68: Status 404 returned error can't find the container with id fe3228b2d1a472c18ab9b7a1c7805bb9948e2c1adfc9ec8e28e73d9ed7066b68 Feb 20 16:51:09 crc kubenswrapper[4697]: I0220 16:51:09.894941 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.206511 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c","Type":"ContainerStarted","Data":"6c2e9df137e85663d7cfd8f310d671bd6722b52bf17c85f0f108a061f532e053"} Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.206829 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c","Type":"ContainerStarted","Data":"b8890a01ec64cc0fbff8d602794046a752a097b8f0c492fe4df76dd9e68fe7b7"} 
Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.206839 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c","Type":"ContainerStarted","Data":"a699bc8ae830021a9daaa37a8951b22b9fce83cf70a26647539393eb718427c5"} Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.207463 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.211618 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": dial tcp 10.217.0.155:9322: connect: connection refused" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.215489 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8abdcc99-bf45-4ca2-82b4-147b0a707333","Type":"ContainerStarted","Data":"fe3228b2d1a472c18ab9b7a1c7805bb9948e2c1adfc9ec8e28e73d9ed7066b68"} Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.229280 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aea295ce-e05d-46a9-9a39-94ffc4b29826","Type":"ContainerStarted","Data":"d96be22343c3b2e35336818b837c98edc74a777a76b0ff819329a21ddc3c4c3e"} Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.233399 4697 generic.go:334] "Generic (PLEG): container finished" podID="c3a092e2-094c-41a2-add8-41c02dbeb9db" containerID="30bfa6abb2edaf4e3a02f67777c65213f67bb8e38caea70c062edb7c7339bd74" exitCode=0 Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.233470 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg" event={"ID":"c3a092e2-094c-41a2-add8-41c02dbeb9db","Type":"ContainerDied","Data":"30bfa6abb2edaf4e3a02f67777c65213f67bb8e38caea70c062edb7c7339bd74"} Feb 20 
16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.233493 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg" event={"ID":"c3a092e2-094c-41a2-add8-41c02dbeb9db","Type":"ContainerStarted","Data":"200f7543625db88f28daba8d4bed3699c42190e26ee1c8026ae6d8667f942990"} Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.235345 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7pf4" event={"ID":"581f9b28-e6b4-41c2-ab2b-d3e9392615f2","Type":"ContainerStarted","Data":"2792f05136401783af589827b871e78e478603352525ff27d3ed9941cd730bfd"} Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.235364 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7pf4" event={"ID":"581f9b28-e6b4-41c2-ab2b-d3e9392615f2","Type":"ContainerStarted","Data":"c9b2e5d50c61651af8f3593b90988f6b71e2ee323b711b816a3b603120017a0c"} Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.242368 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.24234445 podStartE2EDuration="3.24234445s" podCreationTimestamp="2026-02-20 16:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:10.229277419 +0000 UTC m=+1178.009322827" watchObservedRunningTime="2026-02-20 16:51:10.24234445 +0000 UTC m=+1178.022389858" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.303649 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hjgnj"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.339634 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:51:10 crc kubenswrapper[4697]: W0220 16:51:10.344808 4697 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb988e030_0e85_401e_bd22_057f9c2de43d.slice/crio-322cdfcd049df9e9090d84e0fbfeda566988bf1a2b7d712d72106dc88bc2c126 WatchSource:0}: Error finding container 322cdfcd049df9e9090d84e0fbfeda566988bf1a2b7d712d72106dc88bc2c126: Status 404 returned error can't find the container with id 322cdfcd049df9e9090d84e0fbfeda566988bf1a2b7d712d72106dc88bc2c126 Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.356510 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-c7pf4" podStartSLOduration=3.356486043 podStartE2EDuration="3.356486043s" podCreationTimestamp="2026-02-20 16:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:10.30669195 +0000 UTC m=+1178.086737358" watchObservedRunningTime="2026-02-20 16:51:10.356486043 +0000 UTC m=+1178.136531471" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.436413 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-95ff577cc-sl4rs"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.484536 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lv84x"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.537426 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-k8pk7"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.575154 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.603891 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:51:10 crc kubenswrapper[4697]: W0220 16:51:10.629378 4697 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1083350_cbcc_4cf4_a927_a1b63029ffbe.slice/crio-7f949df40f10fd2fdd291867f87d15a03bd8b7f07e2c805f4784f70a7ab0da9c WatchSource:0}: Error finding container 7f949df40f10fd2fdd291867f87d15a03bd8b7f07e2c805f4784f70a7ab0da9c: Status 404 returned error can't find the container with id 7f949df40f10fd2fdd291867f87d15a03bd8b7f07e2c805f4784f70a7ab0da9c Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.637008 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-95ff577cc-sl4rs"] Feb 20 16:51:10 crc kubenswrapper[4697]: W0220 16:51:10.671728 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bc8e40c_20d6_41e9_9f4e_25112a77e115.slice/crio-74657d3f189c4afdb55290a623f314e1b7eec4efe43a1a5dc188607aebcafe58 WatchSource:0}: Error finding container 74657d3f189c4afdb55290a623f314e1b7eec4efe43a1a5dc188607aebcafe58: Status 404 returned error can't find the container with id 74657d3f189c4afdb55290a623f314e1b7eec4efe43a1a5dc188607aebcafe58 Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.678588 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc676dfcf-zstfx"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.687227 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-679c794b85-nbtsc"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.689025 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.701030 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-ffc9c47b9-vz9sg"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.715911 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.725074 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-679c794b85-nbtsc"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.732845 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dzcml"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.741562 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 16:51:10 crc kubenswrapper[4697]: W0220 16:51:10.745493 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f7e5f6d_970f_4773_8a37_bb831984fc44.slice/crio-aea2fe477858d26298ef259e688d9a48b10625a09c4f9b4a755f3ea8eeada7a5 WatchSource:0}: Error finding container aea2fe477858d26298ef259e688d9a48b10625a09c4f9b4a755f3ea8eeada7a5: Status 404 returned error can't find the container with id aea2fe477858d26298ef259e688d9a48b10625a09c4f9b4a755f3ea8eeada7a5 Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.759223 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-scripts\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.759270 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/38688b72-7aff-4efe-988f-aa147e5865e2-logs\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.759300 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/38688b72-7aff-4efe-988f-aa147e5865e2-horizon-secret-key\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.759353 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqsjj\" (UniqueName: \"kubernetes.io/projected/38688b72-7aff-4efe-988f-aa147e5865e2-kube-api-access-dqsjj\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.759371 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-config-data\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.863151 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-scripts\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.863207 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/38688b72-7aff-4efe-988f-aa147e5865e2-logs\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.863237 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/38688b72-7aff-4efe-988f-aa147e5865e2-horizon-secret-key\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.863290 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqsjj\" (UniqueName: \"kubernetes.io/projected/38688b72-7aff-4efe-988f-aa147e5865e2-kube-api-access-dqsjj\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.863307 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-config-data\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.864128 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-scripts\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.865584 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-config-data\") pod \"horizon-679c794b85-nbtsc\" (UID: 
\"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.867915 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38688b72-7aff-4efe-988f-aa147e5865e2-logs\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.867953 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.872279 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/38688b72-7aff-4efe-988f-aa147e5865e2-horizon-secret-key\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.930369 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:51:10 crc kubenswrapper[4697]: I0220 16:51:10.946270 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqsjj\" (UniqueName: \"kubernetes.io/projected/38688b72-7aff-4efe-988f-aa147e5865e2-kube-api-access-dqsjj\") pod \"horizon-679c794b85-nbtsc\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.007217 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.350140 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"62c15b1d-702d-4d85-93d3-c949ae8e421e","Type":"ContainerStarted","Data":"5265c1c686daae3d2de42e92d461d9552aa3529efbe6b83f290bc643e85bf2fa"} Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.372485 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lv84x" event={"ID":"d1aba0bf-7e2c-4a14-894d-d1247f7356eb","Type":"ContainerStarted","Data":"196a85827544d29dc5265edee8f288ca5cf52ce433e549303d14caaf230f5cb5"} Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.387789 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.396682 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95ff577cc-sl4rs" event={"ID":"dfa5c6d4-0e61-4a76-8154-275204545c7b","Type":"ContainerStarted","Data":"86aae0a2593b8caabca2a207a546f404b4daa9a06b08b5ebde8e2adaecd8a61b"} Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.454602 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ffc9c47b9-vz9sg" event={"ID":"a1083350-cbcc-4cf4-a927-a1b63029ffbe","Type":"ContainerStarted","Data":"7f949df40f10fd2fdd291867f87d15a03bd8b7f07e2c805f4784f70a7ab0da9c"} Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.469311 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k8pk7" event={"ID":"5959da8c-890a-4acb-9781-71b7c9fb33a5","Type":"ContainerStarted","Data":"4e259e69a16913e6246d8feb28c4b01bb4b9d46e33d79bc0dfc5a5b22dd8a3d3"} Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.475508 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dzcml" 
event={"ID":"4bc8e40c-20d6-41e9-9f4e-25112a77e115","Type":"ContainerStarted","Data":"74657d3f189c4afdb55290a623f314e1b7eec4efe43a1a5dc188607aebcafe58"} Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.493584 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b988e030-0e85-401e-bd22-057f9c2de43d","Type":"ContainerStarted","Data":"322cdfcd049df9e9090d84e0fbfeda566988bf1a2b7d712d72106dc88bc2c126"} Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.496284 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" event={"ID":"44731724-9f5b-4193-82c9-4233d91bac74","Type":"ContainerStarted","Data":"128c542707cb566330e27c0560bddb65a8272d4fcd486b7db631fd5647d26507"} Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.502546 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f7e5f6d-970f-4773-8a37-bb831984fc44","Type":"ContainerStarted","Data":"aea2fe477858d26298ef259e688d9a48b10625a09c4f9b4a755f3ea8eeada7a5"} Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.510064 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hjgnj" event={"ID":"39238d29-1c84-4169-b011-455bd2e7f000","Type":"ContainerStarted","Data":"add3a53fcaca8909dc0762cd481a186b768d21345db5267f4b314cc0050f23f4"} Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.587585 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-swift-storage-0\") pod \"c3a092e2-094c-41a2-add8-41c02dbeb9db\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.587710 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-nb\") pod \"c3a092e2-094c-41a2-add8-41c02dbeb9db\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.587751 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-config\") pod \"c3a092e2-094c-41a2-add8-41c02dbeb9db\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.587769 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-sb\") pod \"c3a092e2-094c-41a2-add8-41c02dbeb9db\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.587805 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz6g9\" (UniqueName: \"kubernetes.io/projected/c3a092e2-094c-41a2-add8-41c02dbeb9db-kube-api-access-bz6g9\") pod \"c3a092e2-094c-41a2-add8-41c02dbeb9db\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.587853 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-svc\") pod \"c3a092e2-094c-41a2-add8-41c02dbeb9db\" (UID: \"c3a092e2-094c-41a2-add8-41c02dbeb9db\") " Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.611570 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a092e2-094c-41a2-add8-41c02dbeb9db-kube-api-access-bz6g9" (OuterVolumeSpecName: "kube-api-access-bz6g9") pod "c3a092e2-094c-41a2-add8-41c02dbeb9db" (UID: "c3a092e2-094c-41a2-add8-41c02dbeb9db"). InnerVolumeSpecName "kube-api-access-bz6g9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.626191 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-config" (OuterVolumeSpecName: "config") pod "c3a092e2-094c-41a2-add8-41c02dbeb9db" (UID: "c3a092e2-094c-41a2-add8-41c02dbeb9db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.639997 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3a092e2-094c-41a2-add8-41c02dbeb9db" (UID: "c3a092e2-094c-41a2-add8-41c02dbeb9db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.662460 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c3a092e2-094c-41a2-add8-41c02dbeb9db" (UID: "c3a092e2-094c-41a2-add8-41c02dbeb9db"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.662923 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c3a092e2-094c-41a2-add8-41c02dbeb9db" (UID: "c3a092e2-094c-41a2-add8-41c02dbeb9db"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.666152 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c3a092e2-094c-41a2-add8-41c02dbeb9db" (UID: "c3a092e2-094c-41a2-add8-41c02dbeb9db"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.692524 4697 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.692551 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.692563 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.692571 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.692580 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz6g9\" (UniqueName: \"kubernetes.io/projected/c3a092e2-094c-41a2-add8-41c02dbeb9db-kube-api-access-bz6g9\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.692590 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/c3a092e2-094c-41a2-add8-41c02dbeb9db-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:11 crc kubenswrapper[4697]: I0220 16:51:11.879187 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-679c794b85-nbtsc"] Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.535519 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg" event={"ID":"c3a092e2-094c-41a2-add8-41c02dbeb9db","Type":"ContainerDied","Data":"200f7543625db88f28daba8d4bed3699c42190e26ee1c8026ae6d8667f942990"} Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.535809 4697 scope.go:117] "RemoveContainer" containerID="30bfa6abb2edaf4e3a02f67777c65213f67bb8e38caea70c062edb7c7339bd74" Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.535794 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f86f8bdf5-h78wg" Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.545522 4697 generic.go:334] "Generic (PLEG): container finished" podID="44731724-9f5b-4193-82c9-4233d91bac74" containerID="5d06133e9dc95c94283a23ded1a16c174e639b328ea08d89b993cb3e04a5ab75" exitCode=0 Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.545603 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" event={"ID":"44731724-9f5b-4193-82c9-4233d91bac74","Type":"ContainerDied","Data":"5d06133e9dc95c94283a23ded1a16c174e639b328ea08d89b993cb3e04a5ab75"} Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.559245 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f7e5f6d-970f-4773-8a37-bb831984fc44","Type":"ContainerStarted","Data":"1ca32382fd227cc2c6e15cab981256bde06320af92c5c67127b470fe5c3a577e"} Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.571322 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-k8pk7" event={"ID":"5959da8c-890a-4acb-9781-71b7c9fb33a5","Type":"ContainerStarted","Data":"1787c7c15684669833a35fc56d74a3c79b5bff5b74c598b121e187fb68c5191d"} Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.571365 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api-log" containerID="cri-o://b8890a01ec64cc0fbff8d602794046a752a097b8f0c492fe4df76dd9e68fe7b7" gracePeriod=30 Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.571512 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" containerID="cri-o://6c2e9df137e85663d7cfd8f310d671bd6722b52bf17c85f0f108a061f532e053" gracePeriod=30 Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.611917 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": EOF" Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.623027 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-k8pk7" podStartSLOduration=5.623007821 podStartE2EDuration="5.623007821s" podCreationTimestamp="2026-02-20 16:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:12.606747772 +0000 UTC m=+1180.386793250" watchObservedRunningTime="2026-02-20 16:51:12.623007821 +0000 UTC m=+1180.403053229" Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.671149 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f86f8bdf5-h78wg"] Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.679480 4697 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f86f8bdf5-h78wg"] Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.905159 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3a092e2-094c-41a2-add8-41c02dbeb9db" path="/var/lib/kubelet/pods/c3a092e2-094c-41a2-add8-41c02dbeb9db/volumes" Feb 20 16:51:12 crc kubenswrapper[4697]: I0220 16:51:12.905795 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 20 16:51:13 crc kubenswrapper[4697]: I0220 16:51:13.586077 4697 generic.go:334] "Generic (PLEG): container finished" podID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerID="b8890a01ec64cc0fbff8d602794046a752a097b8f0c492fe4df76dd9e68fe7b7" exitCode=143 Feb 20 16:51:13 crc kubenswrapper[4697]: I0220 16:51:13.587092 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c","Type":"ContainerDied","Data":"b8890a01ec64cc0fbff8d602794046a752a097b8f0c492fe4df76dd9e68fe7b7"} Feb 20 16:51:14 crc kubenswrapper[4697]: I0220 16:51:14.623818 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"62c15b1d-702d-4d85-93d3-c949ae8e421e","Type":"ContainerStarted","Data":"f88372720cc99174a4231b685ba7e0d75000786f451ddc1a7d06d9bf265a62d1"} Feb 20 16:51:14 crc kubenswrapper[4697]: I0220 16:51:14.627115 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c794b85-nbtsc" event={"ID":"38688b72-7aff-4efe-988f-aa147e5865e2","Type":"ContainerStarted","Data":"4eb7122a0a2b0d83cbee73f1b44364376d2a055ee0743433625411cb8984f051"} Feb 20 16:51:15 crc kubenswrapper[4697]: I0220 16:51:15.360219 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": read tcp 10.217.0.2:38010->10.217.0.155:9322: 
read: connection reset by peer" Feb 20 16:51:15 crc kubenswrapper[4697]: I0220 16:51:15.360986 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": dial tcp 10.217.0.155:9322: connect: connection refused" Feb 20 16:51:15 crc kubenswrapper[4697]: I0220 16:51:15.642911 4697 generic.go:334] "Generic (PLEG): container finished" podID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerID="6c2e9df137e85663d7cfd8f310d671bd6722b52bf17c85f0f108a061f532e053" exitCode=0 Feb 20 16:51:15 crc kubenswrapper[4697]: I0220 16:51:15.642971 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c","Type":"ContainerDied","Data":"6c2e9df137e85663d7cfd8f310d671bd6722b52bf17c85f0f108a061f532e053"} Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.655949 4697 generic.go:334] "Generic (PLEG): container finished" podID="581f9b28-e6b4-41c2-ab2b-d3e9392615f2" containerID="2792f05136401783af589827b871e78e478603352525ff27d3ed9941cd730bfd" exitCode=0 Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.655987 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7pf4" event={"ID":"581f9b28-e6b4-41c2-ab2b-d3e9392615f2","Type":"ContainerDied","Data":"2792f05136401783af589827b871e78e478603352525ff27d3ed9941cd730bfd"} Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.765685 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-ffc9c47b9-vz9sg"] Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.793948 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bf9cb54f8-bfxkg"] Feb 20 16:51:16 crc kubenswrapper[4697]: E0220 16:51:16.794338 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a092e2-094c-41a2-add8-41c02dbeb9db" 
containerName="init" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.794353 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a092e2-094c-41a2-add8-41c02dbeb9db" containerName="init" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.794556 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3a092e2-094c-41a2-add8-41c02dbeb9db" containerName="init" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.795468 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.801554 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.813190 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bf9cb54f8-bfxkg"] Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.848124 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-config-data\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.848198 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6667b0d-626d-4578-9767-2026d21b1583-logs\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.848237 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-combined-ca-bundle\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: 
\"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.848297 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-scripts\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.848376 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2chfk\" (UniqueName: \"kubernetes.io/projected/d6667b0d-626d-4578-9767-2026d21b1583-kube-api-access-2chfk\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.848617 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-tls-certs\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.848696 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-secret-key\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.879781 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-679c794b85-nbtsc"] Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.928014 4697 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-6bc54df884-mx794"] Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.932351 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.951288 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-config-data\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.951379 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/215f7a56-10a1-4ae5-9071-4983dbb45b35-config-data\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.951419 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6667b0d-626d-4578-9767-2026d21b1583-logs\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.951478 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-combined-ca-bundle\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.953822 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-config-data\") pod 
\"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.954306 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6667b0d-626d-4578-9767-2026d21b1583-logs\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.954529 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/215f7a56-10a1-4ae5-9071-4983dbb45b35-horizon-tls-certs\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.954855 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-scripts\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.954995 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215f7a56-10a1-4ae5-9071-4983dbb45b35-combined-ca-bundle\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.955198 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2chfk\" (UniqueName: \"kubernetes.io/projected/d6667b0d-626d-4578-9767-2026d21b1583-kube-api-access-2chfk\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") 
" pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.955465 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-tls-certs\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.955559 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-secret-key\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.955636 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/215f7a56-10a1-4ae5-9071-4983dbb45b35-logs\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.955716 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/215f7a56-10a1-4ae5-9071-4983dbb45b35-scripts\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.955795 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/215f7a56-10a1-4ae5-9071-4983dbb45b35-horizon-secret-key\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:16 
crc kubenswrapper[4697]: I0220 16:51:16.955957 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rsfr\" (UniqueName: \"kubernetes.io/projected/215f7a56-10a1-4ae5-9071-4983dbb45b35-kube-api-access-7rsfr\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:16 crc kubenswrapper[4697]: I0220 16:51:16.955476 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-scripts\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:16.996580 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-secret-key\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:16.997340 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-combined-ca-bundle\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:16.997697 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-tls-certs\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.000846 4697 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/horizon-6bc54df884-mx794"] Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.012931 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2chfk\" (UniqueName: \"kubernetes.io/projected/d6667b0d-626d-4578-9767-2026d21b1583-kube-api-access-2chfk\") pod \"horizon-7bf9cb54f8-bfxkg\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.061313 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/215f7a56-10a1-4ae5-9071-4983dbb45b35-logs\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.061370 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/215f7a56-10a1-4ae5-9071-4983dbb45b35-scripts\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.061393 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/215f7a56-10a1-4ae5-9071-4983dbb45b35-horizon-secret-key\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.061458 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rsfr\" (UniqueName: \"kubernetes.io/projected/215f7a56-10a1-4ae5-9071-4983dbb45b35-kube-api-access-7rsfr\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 
16:51:17.061488 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/215f7a56-10a1-4ae5-9071-4983dbb45b35-config-data\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.061533 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/215f7a56-10a1-4ae5-9071-4983dbb45b35-horizon-tls-certs\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.061564 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215f7a56-10a1-4ae5-9071-4983dbb45b35-combined-ca-bundle\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.062233 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/215f7a56-10a1-4ae5-9071-4983dbb45b35-logs\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.062938 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/215f7a56-10a1-4ae5-9071-4983dbb45b35-scripts\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.063283 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/215f7a56-10a1-4ae5-9071-4983dbb45b35-config-data\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.066924 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/215f7a56-10a1-4ae5-9071-4983dbb45b35-horizon-secret-key\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.067548 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215f7a56-10a1-4ae5-9071-4983dbb45b35-combined-ca-bundle\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.068815 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/215f7a56-10a1-4ae5-9071-4983dbb45b35-horizon-tls-certs\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.079999 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rsfr\" (UniqueName: \"kubernetes.io/projected/215f7a56-10a1-4ae5-9071-4983dbb45b35-kube-api-access-7rsfr\") pod \"horizon-6bc54df884-mx794\" (UID: \"215f7a56-10a1-4ae5-9071-4983dbb45b35\") " pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.122230 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6bc54df884-mx794" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.125396 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:51:17 crc kubenswrapper[4697]: I0220 16:51:17.906856 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": dial tcp 10.217.0.155:9322: connect: connection refused" Feb 20 16:51:26 crc kubenswrapper[4697]: E0220 16:51:26.615086 4697 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 20 16:51:26 crc kubenswrapper[4697]: E0220 16:51:26.615588 4697 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 20 16:51:26 crc kubenswrapper[4697]: E0220 16:51:26.615740 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.38:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfbh5b9h4h566h58fh546hc8h578h654h5dchch565h656h5d8h665h5cfh57h579h5fbh5cfhfbh64fh5bfhcdh655h567h95hb5h4h578h9dh589q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdmmd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-95ff577cc-sl4rs_openstack(dfa5c6d4-0e61-4a76-8154-275204545c7b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 16:51:26 crc kubenswrapper[4697]: E0220 
16:51:26.619557 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.38:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-95ff577cc-sl4rs" podUID="dfa5c6d4-0e61-4a76-8154-275204545c7b" Feb 20 16:51:26 crc kubenswrapper[4697]: E0220 16:51:26.620035 4697 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 20 16:51:26 crc kubenswrapper[4697]: E0220 16:51:26.620201 4697 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 20 16:51:26 crc kubenswrapper[4697]: E0220 16:51:26.620401 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.38:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bbh654hcbh68bhddh6bh649hfbh57fhc8hd7h64dh569h5b6h64hbh669h546h66hd7hb8h9h54ch665hdchc4h9dhd7h64fh645h554h557q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn4q4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-ffc9c47b9-vz9sg_openstack(a1083350-cbcc-4cf4-a927-a1b63029ffbe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 16:51:26 crc kubenswrapper[4697]: E0220 16:51:26.622907 
4697 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.38:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-ffc9c47b9-vz9sg" podUID="a1083350-cbcc-4cf4-a927-a1b63029ffbe" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.696390 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c7pf4" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.765961 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7pf4" event={"ID":"581f9b28-e6b4-41c2-ab2b-d3e9392615f2","Type":"ContainerDied","Data":"c9b2e5d50c61651af8f3593b90988f6b71e2ee323b711b816a3b603120017a0c"} Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.766000 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c7pf4" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.766021 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b2e5d50c61651af8f3593b90988f6b71e2ee323b711b816a3b603120017a0c" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.893083 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-scripts\") pod \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.893135 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nqkv\" (UniqueName: \"kubernetes.io/projected/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-kube-api-access-5nqkv\") pod \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.893196 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-credential-keys\") pod \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.893266 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-fernet-keys\") pod \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.893336 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-combined-ca-bundle\") pod \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\" 
(UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.893376 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-config-data\") pod \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\" (UID: \"581f9b28-e6b4-41c2-ab2b-d3e9392615f2\") " Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.900120 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-kube-api-access-5nqkv" (OuterVolumeSpecName: "kube-api-access-5nqkv") pod "581f9b28-e6b4-41c2-ab2b-d3e9392615f2" (UID: "581f9b28-e6b4-41c2-ab2b-d3e9392615f2"). InnerVolumeSpecName "kube-api-access-5nqkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.913720 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "581f9b28-e6b4-41c2-ab2b-d3e9392615f2" (UID: "581f9b28-e6b4-41c2-ab2b-d3e9392615f2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.913926 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "581f9b28-e6b4-41c2-ab2b-d3e9392615f2" (UID: "581f9b28-e6b4-41c2-ab2b-d3e9392615f2"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.915136 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-scripts" (OuterVolumeSpecName: "scripts") pod "581f9b28-e6b4-41c2-ab2b-d3e9392615f2" (UID: "581f9b28-e6b4-41c2-ab2b-d3e9392615f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.923644 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-config-data" (OuterVolumeSpecName: "config-data") pod "581f9b28-e6b4-41c2-ab2b-d3e9392615f2" (UID: "581f9b28-e6b4-41c2-ab2b-d3e9392615f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.926270 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "581f9b28-e6b4-41c2-ab2b-d3e9392615f2" (UID: "581f9b28-e6b4-41c2-ab2b-d3e9392615f2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.996492 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.996531 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nqkv\" (UniqueName: \"kubernetes.io/projected/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-kube-api-access-5nqkv\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.996550 4697 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.996563 4697 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.996575 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:26 crc kubenswrapper[4697]: I0220 16:51:26.996586 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/581f9b28-e6b4-41c2-ab2b-d3e9392615f2-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.789146 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-c7pf4"] Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.797049 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-c7pf4"] Feb 20 16:51:27 crc 
kubenswrapper[4697]: I0220 16:51:27.899741 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dz6jg"] Feb 20 16:51:27 crc kubenswrapper[4697]: E0220 16:51:27.900193 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581f9b28-e6b4-41c2-ab2b-d3e9392615f2" containerName="keystone-bootstrap" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.900214 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="581f9b28-e6b4-41c2-ab2b-d3e9392615f2" containerName="keystone-bootstrap" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.900402 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="581f9b28-e6b4-41c2-ab2b-d3e9392615f2" containerName="keystone-bootstrap" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.901111 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.902957 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.906492 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.910847 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.911186 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.911711 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.911736 4697 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nd79" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.919721 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dz6jg"] Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.935531 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-combined-ca-bundle\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.935616 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbfpp\" (UniqueName: \"kubernetes.io/projected/36fa3e27-befa-408d-ac73-377e04786963-kube-api-access-mbfpp\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.935759 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-scripts\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.935825 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-fernet-keys\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.936064 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-credential-keys\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:27 crc kubenswrapper[4697]: I0220 16:51:27.936127 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-config-data\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.037968 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-credential-keys\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.039219 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-config-data\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.039644 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-combined-ca-bundle\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.039822 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mbfpp\" (UniqueName: \"kubernetes.io/projected/36fa3e27-befa-408d-ac73-377e04786963-kube-api-access-mbfpp\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.039998 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-scripts\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.040089 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-fernet-keys\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.043902 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-scripts\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.044207 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-combined-ca-bundle\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.044798 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-config-data\") pod \"keystone-bootstrap-dz6jg\" (UID: 
\"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.045373 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-credential-keys\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.053824 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-fernet-keys\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.056056 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbfpp\" (UniqueName: \"kubernetes.io/projected/36fa3e27-befa-408d-ac73-377e04786963-kube-api-access-mbfpp\") pod \"keystone-bootstrap-dz6jg\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.251772 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:28 crc kubenswrapper[4697]: I0220 16:51:28.887803 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="581f9b28-e6b4-41c2-ab2b-d3e9392615f2" path="/var/lib/kubelet/pods/581f9b28-e6b4-41c2-ab2b-d3e9392615f2/volumes" Feb 20 16:51:30 crc kubenswrapper[4697]: I0220 16:51:30.799403 4697 generic.go:334] "Generic (PLEG): container finished" podID="5959da8c-890a-4acb-9781-71b7c9fb33a5" containerID="1787c7c15684669833a35fc56d74a3c79b5bff5b74c598b121e187fb68c5191d" exitCode=0 Feb 20 16:51:30 crc kubenswrapper[4697]: I0220 16:51:30.799493 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k8pk7" event={"ID":"5959da8c-890a-4acb-9781-71b7c9fb33a5","Type":"ContainerDied","Data":"1787c7c15684669833a35fc56d74a3c79b5bff5b74c598b121e187fb68c5191d"} Feb 20 16:51:32 crc kubenswrapper[4697]: I0220 16:51:32.907124 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 16:51:34 crc kubenswrapper[4697]: E0220 16:51:34.672649 4697 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Feb 20 16:51:34 crc kubenswrapper[4697]: E0220 16:51:34.672962 4697 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Feb 20 16:51:34 crc kubenswrapper[4697]: E0220 16:51:34.673076 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:38.102.83.38:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v5f9j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-hjgnj_openstack(39238d29-1c84-4169-b011-455bd2e7f000): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 16:51:34 crc kubenswrapper[4697]: E0220 16:51:34.674877 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-hjgnj" podUID="39238d29-1c84-4169-b011-455bd2e7f000" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.829325 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.833359 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c","Type":"ContainerDied","Data":"a699bc8ae830021a9daaa37a8951b22b9fce83cf70a26647539393eb718427c5"} Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.833388 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.833397 4697 scope.go:117] "RemoveContainer" containerID="6c2e9df137e85663d7cfd8f310d671bd6722b52bf17c85f0f108a061f532e053" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.834391 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.834799 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95ff577cc-sl4rs" event={"ID":"dfa5c6d4-0e61-4a76-8154-275204545c7b","Type":"ContainerDied","Data":"86aae0a2593b8caabca2a207a546f404b4daa9a06b08b5ebde8e2adaecd8a61b"} Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.834836 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86aae0a2593b8caabca2a207a546f404b4daa9a06b08b5ebde8e2adaecd8a61b" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.836635 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ffc9c47b9-vz9sg" event={"ID":"a1083350-cbcc-4cf4-a927-a1b63029ffbe","Type":"ContainerDied","Data":"7f949df40f10fd2fdd291867f87d15a03bd8b7f07e2c805f4784f70a7ab0da9c"} Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.836759 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-ffc9c47b9-vz9sg" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.841504 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k8pk7" event={"ID":"5959da8c-890a-4acb-9781-71b7c9fb33a5","Type":"ContainerDied","Data":"4e259e69a16913e6246d8feb28c4b01bb4b9d46e33d79bc0dfc5a5b22dd8a3d3"} Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.841541 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e259e69a16913e6246d8feb28c4b01bb4b9d46e33d79bc0dfc5a5b22dd8a3d3" Feb 20 16:51:34 crc kubenswrapper[4697]: E0220 16:51:34.842058 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.38:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-hjgnj" podUID="39238d29-1c84-4169-b011-455bd2e7f000" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.842844 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.849798 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.959993 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-custom-prometheus-ca\") pod \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960195 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdmmd\" (UniqueName: \"kubernetes.io/projected/dfa5c6d4-0e61-4a76-8154-275204545c7b-kube-api-access-fdmmd\") pod \"dfa5c6d4-0e61-4a76-8154-275204545c7b\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960255 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dfa5c6d4-0e61-4a76-8154-275204545c7b-horizon-secret-key\") pod \"dfa5c6d4-0e61-4a76-8154-275204545c7b\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960297 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfa5c6d4-0e61-4a76-8154-275204545c7b-logs\") pod \"dfa5c6d4-0e61-4a76-8154-275204545c7b\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960333 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxd2h\" (UniqueName: \"kubernetes.io/projected/5959da8c-890a-4acb-9781-71b7c9fb33a5-kube-api-access-sxd2h\") pod \"5959da8c-890a-4acb-9781-71b7c9fb33a5\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960404 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-scripts\") pod \"dfa5c6d4-0e61-4a76-8154-275204545c7b\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960454 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7mp4\" (UniqueName: \"kubernetes.io/projected/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-kube-api-access-w7mp4\") pod \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960485 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1083350-cbcc-4cf4-a927-a1b63029ffbe-horizon-secret-key\") pod \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960539 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1083350-cbcc-4cf4-a927-a1b63029ffbe-logs\") pod \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960569 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-combined-ca-bundle\") pod \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960593 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-config-data\") pod \"dfa5c6d4-0e61-4a76-8154-275204545c7b\" (UID: \"dfa5c6d4-0e61-4a76-8154-275204545c7b\") " Feb 20 16:51:34 crc 
kubenswrapper[4697]: I0220 16:51:34.960632 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-config\") pod \"5959da8c-890a-4acb-9781-71b7c9fb33a5\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960667 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-logs\") pod \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960692 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn4q4\" (UniqueName: \"kubernetes.io/projected/a1083350-cbcc-4cf4-a927-a1b63029ffbe-kube-api-access-vn4q4\") pod \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960718 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-config-data\") pod \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\" (UID: \"9b1e8265-b4c7-4f31-a5d6-2af38bd5871c\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960745 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-config-data\") pod \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\" (UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960774 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-scripts\") pod \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\" 
(UID: \"a1083350-cbcc-4cf4-a927-a1b63029ffbe\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960806 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-combined-ca-bundle\") pod \"5959da8c-890a-4acb-9781-71b7c9fb33a5\" (UID: \"5959da8c-890a-4acb-9781-71b7c9fb33a5\") " Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.960811 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa5c6d4-0e61-4a76-8154-275204545c7b-logs" (OuterVolumeSpecName: "logs") pod "dfa5c6d4-0e61-4a76-8154-275204545c7b" (UID: "dfa5c6d4-0e61-4a76-8154-275204545c7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.961311 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfa5c6d4-0e61-4a76-8154-275204545c7b-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.961708 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1083350-cbcc-4cf4-a927-a1b63029ffbe-logs" (OuterVolumeSpecName: "logs") pod "a1083350-cbcc-4cf4-a927-a1b63029ffbe" (UID: "a1083350-cbcc-4cf4-a927-a1b63029ffbe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.962154 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-config-data" (OuterVolumeSpecName: "config-data") pod "dfa5c6d4-0e61-4a76-8154-275204545c7b" (UID: "dfa5c6d4-0e61-4a76-8154-275204545c7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.962281 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-logs" (OuterVolumeSpecName: "logs") pod "9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" (UID: "9b1e8265-b4c7-4f31-a5d6-2af38bd5871c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.966158 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-scripts" (OuterVolumeSpecName: "scripts") pod "a1083350-cbcc-4cf4-a927-a1b63029ffbe" (UID: "a1083350-cbcc-4cf4-a927-a1b63029ffbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.967199 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-config-data" (OuterVolumeSpecName: "config-data") pod "a1083350-cbcc-4cf4-a927-a1b63029ffbe" (UID: "a1083350-cbcc-4cf4-a927-a1b63029ffbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.967840 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-scripts" (OuterVolumeSpecName: "scripts") pod "dfa5c6d4-0e61-4a76-8154-275204545c7b" (UID: "dfa5c6d4-0e61-4a76-8154-275204545c7b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.970962 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5959da8c-890a-4acb-9781-71b7c9fb33a5-kube-api-access-sxd2h" (OuterVolumeSpecName: "kube-api-access-sxd2h") pod "5959da8c-890a-4acb-9781-71b7c9fb33a5" (UID: "5959da8c-890a-4acb-9781-71b7c9fb33a5"). InnerVolumeSpecName "kube-api-access-sxd2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.973550 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1083350-cbcc-4cf4-a927-a1b63029ffbe-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a1083350-cbcc-4cf4-a927-a1b63029ffbe" (UID: "a1083350-cbcc-4cf4-a927-a1b63029ffbe"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.984986 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa5c6d4-0e61-4a76-8154-275204545c7b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dfa5c6d4-0e61-4a76-8154-275204545c7b" (UID: "dfa5c6d4-0e61-4a76-8154-275204545c7b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.986926 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1083350-cbcc-4cf4-a927-a1b63029ffbe-kube-api-access-vn4q4" (OuterVolumeSpecName: "kube-api-access-vn4q4") pod "a1083350-cbcc-4cf4-a927-a1b63029ffbe" (UID: "a1083350-cbcc-4cf4-a927-a1b63029ffbe"). InnerVolumeSpecName "kube-api-access-vn4q4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.987065 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa5c6d4-0e61-4a76-8154-275204545c7b-kube-api-access-fdmmd" (OuterVolumeSpecName: "kube-api-access-fdmmd") pod "dfa5c6d4-0e61-4a76-8154-275204545c7b" (UID: "dfa5c6d4-0e61-4a76-8154-275204545c7b"). InnerVolumeSpecName "kube-api-access-fdmmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.987472 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-kube-api-access-w7mp4" (OuterVolumeSpecName: "kube-api-access-w7mp4") pod "9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" (UID: "9b1e8265-b4c7-4f31-a5d6-2af38bd5871c"). InnerVolumeSpecName "kube-api-access-w7mp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.989970 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5959da8c-890a-4acb-9781-71b7c9fb33a5" (UID: "5959da8c-890a-4acb-9781-71b7c9fb33a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:34 crc kubenswrapper[4697]: I0220 16:51:34.995928 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-config" (OuterVolumeSpecName: "config") pod "5959da8c-890a-4acb-9781-71b7c9fb33a5" (UID: "5959da8c-890a-4acb-9781-71b7c9fb33a5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.001680 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" (UID: "9b1e8265-b4c7-4f31-a5d6-2af38bd5871c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.005198 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" (UID: "9b1e8265-b4c7-4f31-a5d6-2af38bd5871c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.027676 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-config-data" (OuterVolumeSpecName: "config-data") pod "9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" (UID: "9b1e8265-b4c7-4f31-a5d6-2af38bd5871c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062507 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdmmd\" (UniqueName: \"kubernetes.io/projected/dfa5c6d4-0e61-4a76-8154-275204545c7b-kube-api-access-fdmmd\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062538 4697 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dfa5c6d4-0e61-4a76-8154-275204545c7b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062548 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxd2h\" (UniqueName: \"kubernetes.io/projected/5959da8c-890a-4acb-9781-71b7c9fb33a5-kube-api-access-sxd2h\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062560 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062568 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7mp4\" (UniqueName: \"kubernetes.io/projected/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-kube-api-access-w7mp4\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062577 4697 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1083350-cbcc-4cf4-a927-a1b63029ffbe-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062585 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1083350-cbcc-4cf4-a927-a1b63029ffbe-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 
16:51:35.062594 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062603 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfa5c6d4-0e61-4a76-8154-275204545c7b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062611 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062619 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062628 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn4q4\" (UniqueName: \"kubernetes.io/projected/a1083350-cbcc-4cf4-a927-a1b63029ffbe-kube-api-access-vn4q4\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062635 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062643 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062651 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a1083350-cbcc-4cf4-a927-a1b63029ffbe-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062659 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5959da8c-890a-4acb-9781-71b7c9fb33a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.062666 4697 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.159581 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.174100 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.181043 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:51:35 crc kubenswrapper[4697]: E0220 16:51:35.181677 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api-log" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.181786 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api-log" Feb 20 16:51:35 crc kubenswrapper[4697]: E0220 16:51:35.181867 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.181953 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" Feb 20 16:51:35 crc kubenswrapper[4697]: E0220 16:51:35.182007 4697 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5959da8c-890a-4acb-9781-71b7c9fb33a5" containerName="neutron-db-sync" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.182059 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="5959da8c-890a-4acb-9781-71b7c9fb33a5" containerName="neutron-db-sync" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.182419 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="5959da8c-890a-4acb-9781-71b7c9fb33a5" containerName="neutron-db-sync" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.182513 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api-log" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.182570 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.183620 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.185731 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.207967 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.223370 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-ffc9c47b9-vz9sg"] Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.230947 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-ffc9c47b9-vz9sg"] Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.368984 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.369273 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-config-data\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.369485 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7t9g\" (UniqueName: \"kubernetes.io/projected/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-kube-api-access-d7t9g\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.369552 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.369607 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-logs\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.470949 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7t9g\" (UniqueName: \"kubernetes.io/projected/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-kube-api-access-d7t9g\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.471003 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.471028 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-logs\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.471081 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " 
pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.471157 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-config-data\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.472195 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-logs\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.474120 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.476095 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.476478 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-config-data\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.504569 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7t9g\" (UniqueName: 
\"kubernetes.io/projected/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-kube-api-access-d7t9g\") pod \"watcher-api-0\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.511610 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.854081 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-k8pk7" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.854087 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-95ff577cc-sl4rs" Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.928222 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-95ff577cc-sl4rs"] Feb 20 16:51:35 crc kubenswrapper[4697]: I0220 16:51:35.940074 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-95ff577cc-sl4rs"] Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.109951 4697 scope.go:117] "RemoveContainer" containerID="b8890a01ec64cc0fbff8d602794046a752a097b8f0c492fe4df76dd9e68fe7b7" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.303365 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc676dfcf-zstfx"] Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.376009 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ddbff5fc9-jlmsc"] Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.377719 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.399323 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-svc\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.401263 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-nb\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.402205 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-sb\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.402935 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-swift-storage-0\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.403245 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78hp4\" (UniqueName: \"kubernetes.io/projected/21ea8664-e35f-40a2-9f86-391033cfc7bd-kube-api-access-78hp4\") pod 
\"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.403392 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-config\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.431569 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ddbff5fc9-jlmsc"] Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.473585 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f4865d6c6-nk2sf"] Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.475206 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f4865d6c6-nk2sf" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.511488 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zwpk7" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.511741 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.511979 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.515713 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-config\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.516868 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-svc\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.518087 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-ovndb-tls-certs\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.518230 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-nb\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.518341 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-sb\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.518466 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-httpd-config\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.518569 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-swift-storage-0\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.518719 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78hp4\" (UniqueName: \"kubernetes.io/projected/21ea8664-e35f-40a2-9f86-391033cfc7bd-kube-api-access-78hp4\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.518832 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-combined-ca-bundle\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.518927 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ncf\" (UniqueName: \"kubernetes.io/projected/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-kube-api-access-j2ncf\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.519016 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-config\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.523020 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-f4865d6c6-nk2sf"] Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.516724 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.517945 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-svc\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.524712 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-swift-storage-0\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.524796 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-nb\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.525030 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-config\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.525137 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-sb\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" 
(UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.600402 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78hp4\" (UniqueName: \"kubernetes.io/projected/21ea8664-e35f-40a2-9f86-391033cfc7bd-kube-api-access-78hp4\") pod \"dnsmasq-dns-6ddbff5fc9-jlmsc\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.620742 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-httpd-config\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.620840 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-combined-ca-bundle\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.620862 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2ncf\" (UniqueName: \"kubernetes.io/projected/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-kube-api-access-j2ncf\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.620916 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-config\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.620994 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-ovndb-tls-certs\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.636535 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-httpd-config\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.639200 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-config\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.639831 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-combined-ca-bundle\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.641310 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-ovndb-tls-certs\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.694664 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ncf\" (UniqueName: \"kubernetes.io/projected/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-kube-api-access-j2ncf\") pod \"neutron-f4865d6c6-nk2sf\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.705890 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.754683 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bf9cb54f8-bfxkg"]
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.792527 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.869082 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf9cb54f8-bfxkg" event={"ID":"d6667b0d-626d-4578-9767-2026d21b1583","Type":"ContainerStarted","Data":"bb23523d88ffd28cdc578147a74321a5dda15c4bb2063b7208d669888250ee20"}
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.892270 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" path="/var/lib/kubelet/pods/9b1e8265-b4c7-4f31-a5d6-2af38bd5871c/volumes"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.893143 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1083350-cbcc-4cf4-a927-a1b63029ffbe" path="/var/lib/kubelet/pods/a1083350-cbcc-4cf4-a927-a1b63029ffbe/volumes"
Feb 20 16:51:36 crc kubenswrapper[4697]: I0220 16:51:36.897932 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa5c6d4-0e61-4a76-8154-275204545c7b" path="/var/lib/kubelet/pods/dfa5c6d4-0e61-4a76-8154-275204545c7b/volumes"
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.029236 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bc54df884-mx794"]
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.202564 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dz6jg"]
Feb 20 16:51:37 crc kubenswrapper[4697]: W0220 16:51:37.307777 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfed00e9_45d6_47dc_8a93_5895a8bc1f8f.slice/crio-ec81f4b7e8118af7a6197055f0e9dc477b78d595ffe5c5d303c4496620b57073 WatchSource:0}: Error finding container ec81f4b7e8118af7a6197055f0e9dc477b78d595ffe5c5d303c4496620b57073: Status 404 returned error can't find the container with id ec81f4b7e8118af7a6197055f0e9dc477b78d595ffe5c5d303c4496620b57073
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.308453 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.313850 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.772757 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f4865d6c6-nk2sf"]
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.785851 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ddbff5fc9-jlmsc"]
Feb 20 16:51:37 crc kubenswrapper[4697]: W0220 16:51:37.792617 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21ea8664_e35f_40a2_9f86_391033cfc7bd.slice/crio-a83caae4db568755ba5b8dc9f3f5d92e52443c42f4f646003d46392d5e4949d2 WatchSource:0}: Error finding container a83caae4db568755ba5b8dc9f3f5d92e52443c42f4f646003d46392d5e4949d2: Status 404 returned error can't find the container with id a83caae4db568755ba5b8dc9f3f5d92e52443c42f4f646003d46392d5e4949d2
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.901371 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f4865d6c6-nk2sf" event={"ID":"101ffeb3-55ef-4fbc-a42f-6532e6ff220e","Type":"ContainerStarted","Data":"69904eb3eb07634d33593dc165c5dd5a0fe5aa3d1f169390135bbe9ec9bfe8f5"}
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.907605 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f","Type":"ContainerStarted","Data":"ec81f4b7e8118af7a6197055f0e9dc477b78d595ffe5c5d303c4496620b57073"}
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.907744 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9b1e8265-b4c7-4f31-a5d6-2af38bd5871c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": dial tcp 10.217.0.155:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)"
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.913357 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" event={"ID":"21ea8664-e35f-40a2-9f86-391033cfc7bd","Type":"ContainerStarted","Data":"a83caae4db568755ba5b8dc9f3f5d92e52443c42f4f646003d46392d5e4949d2"}
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.916627 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dz6jg" event={"ID":"36fa3e27-befa-408d-ac73-377e04786963","Type":"ContainerStarted","Data":"62a3756852d10033b25d41a30fc73cd1602101e232e736e1424d58d655f39a3d"}
Feb 20 16:51:37 crc kubenswrapper[4697]: I0220 16:51:37.919203 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bc54df884-mx794" event={"ID":"215f7a56-10a1-4ae5-9071-4983dbb45b35","Type":"ContainerStarted","Data":"4c01c76434be1c6dc19779151f52cb257668a323befb51663d7ed160788f0fbe"}
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.766592 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57ffcc5f55-zw4vb"]
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.768991 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.773098 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.773521 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.781673 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57ffcc5f55-zw4vb"]
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.889852 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-ovndb-tls-certs\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.890110 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-combined-ca-bundle\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.890228 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-public-tls-certs\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.890355 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-internal-tls-certs\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.890428 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-httpd-config\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.890528 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jvsl\" (UniqueName: \"kubernetes.io/projected/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-kube-api-access-5jvsl\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.890791 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-config\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.992950 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-ovndb-tls-certs\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.992995 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-combined-ca-bundle\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.993021 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-public-tls-certs\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.993051 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-internal-tls-certs\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.993067 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-httpd-config\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.993096 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jvsl\" (UniqueName: \"kubernetes.io/projected/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-kube-api-access-5jvsl\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:38 crc kubenswrapper[4697]: I0220 16:51:38.993155 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-config\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:39 crc kubenswrapper[4697]: I0220 16:51:38.999959 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-internal-tls-certs\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:39 crc kubenswrapper[4697]: I0220 16:51:39.001023 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-config\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:39 crc kubenswrapper[4697]: I0220 16:51:39.001672 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-public-tls-certs\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:39 crc kubenswrapper[4697]: I0220 16:51:39.002197 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-ovndb-tls-certs\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:39 crc kubenswrapper[4697]: I0220 16:51:39.003336 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-combined-ca-bundle\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:39 crc kubenswrapper[4697]: I0220 16:51:39.006557 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-httpd-config\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:39 crc kubenswrapper[4697]: I0220 16:51:39.018369 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jvsl\" (UniqueName: \"kubernetes.io/projected/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-kube-api-access-5jvsl\") pod \"neutron-57ffcc5f55-zw4vb\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") " pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:39 crc kubenswrapper[4697]: I0220 16:51:39.137359 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:51:39 crc kubenswrapper[4697]: I0220 16:51:39.699652 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57ffcc5f55-zw4vb"]
Feb 20 16:51:39 crc kubenswrapper[4697]: I0220 16:51:39.938962 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57ffcc5f55-zw4vb" event={"ID":"b3577c39-e7f5-4def-97f1-8e68e4bc07dd","Type":"ContainerStarted","Data":"e8d65aa10484deba9debd0e779071014a8420b3cc17646732b3392604ebcda18"}
Feb 20 16:51:42 crc kubenswrapper[4697]: I0220 16:51:42.967745 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" event={"ID":"44731724-9f5b-4193-82c9-4233d91bac74","Type":"ContainerStarted","Data":"84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968"}
Feb 20 16:51:43 crc kubenswrapper[4697]: E0220 16:51:43.345536 4697 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-cinder-api:watcher_latest"
Feb 20 16:51:43 crc kubenswrapper[4697]: E0220 16:51:43.345592 4697 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.38:5001/podified-master-centos10/openstack-cinder-api:watcher_latest"
Feb 20 16:51:43 crc kubenswrapper[4697]: E0220 16:51:43.346079 4697 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.38:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g885n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-dzcml_openstack(4bc8e40c-20d6-41e9-9f4e-25112a77e115): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 20 16:51:43 crc kubenswrapper[4697]: E0220 16:51:43.347396 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-dzcml" podUID="4bc8e40c-20d6-41e9-9f4e-25112a77e115"
Feb 20 16:51:43 crc kubenswrapper[4697]: I0220 16:51:43.991143 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57ffcc5f55-zw4vb" event={"ID":"b3577c39-e7f5-4def-97f1-8e68e4bc07dd","Type":"ContainerStarted","Data":"898d85761d702f5464f68eb16f900f5a0d514f25a305e59f68a2d3aa7df28a68"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.018673 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aea295ce-e05d-46a9-9a39-94ffc4b29826","Type":"ContainerStarted","Data":"5831b527e2e4be06390860f6b38fedefa29a4d8e84e38d5eca3a5e9fb5b2db28"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.031029 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dz6jg" event={"ID":"36fa3e27-befa-408d-ac73-377e04786963","Type":"ContainerStarted","Data":"b4f4b6c7eceae72eaf36f93145369ac145650bbb3d018f5decf6183a22506971"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.041933 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=12.259862709 podStartE2EDuration="37.041911127s" podCreationTimestamp="2026-02-20 16:51:07 +0000 UTC" firstStartedPulling="2026-02-20 16:51:09.910952202 +0000 UTC m=+1177.690997600" lastFinishedPulling="2026-02-20 16:51:34.69300061 +0000 UTC m=+1202.473046018" observedRunningTime="2026-02-20 16:51:44.03310423 +0000 UTC m=+1211.813149638" watchObservedRunningTime="2026-02-20 16:51:44.041911127 +0000 UTC m=+1211.821956535"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.060349 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f7e5f6d-970f-4773-8a37-bb831984fc44","Type":"ContainerStarted","Data":"f1637e5b0b55c8e906489a5de7f78deeb49a5a9ca42b65d7563ef908e139eb9d"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.060533 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6f7e5f6d-970f-4773-8a37-bb831984fc44" containerName="glance-log" containerID="cri-o://1ca32382fd227cc2c6e15cab981256bde06320af92c5c67127b470fe5c3a577e" gracePeriod=30
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.060839 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6f7e5f6d-970f-4773-8a37-bb831984fc44" containerName="glance-httpd" containerID="cri-o://f1637e5b0b55c8e906489a5de7f78deeb49a5a9ca42b65d7563ef908e139eb9d" gracePeriod=30
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.062185 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dz6jg" podStartSLOduration=17.062172944 podStartE2EDuration="17.062172944s" podCreationTimestamp="2026-02-20 16:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:44.052769733 +0000 UTC m=+1211.832815141" watchObservedRunningTime="2026-02-20 16:51:44.062172944 +0000 UTC m=+1211.842218352"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.113048 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f4865d6c6-nk2sf" event={"ID":"101ffeb3-55ef-4fbc-a42f-6532e6ff220e","Type":"ContainerStarted","Data":"aebb766dec87808ba3cfa76453cedb421eb8d23cfe826110da5123e01b0ab6e1"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.113100 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f4865d6c6-nk2sf" event={"ID":"101ffeb3-55ef-4fbc-a42f-6532e6ff220e","Type":"ContainerStarted","Data":"942e3fd4d8e91fd32f78f7db69871651847f38705f7bbfaa09ba841bce8c8ce4"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.114255 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.170026 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=36.170005732 podStartE2EDuration="36.170005732s" podCreationTimestamp="2026-02-20 16:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:44.129250341 +0000 UTC m=+1211.909295739" watchObservedRunningTime="2026-02-20 16:51:44.170005732 +0000 UTC m=+1211.950051140"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.170662 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"62c15b1d-702d-4d85-93d3-c949ae8e421e","Type":"ContainerStarted","Data":"c44f2a943e56fc65ef0cb37678c341d6724197ce21271b237fc9ca867c7b43c8"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.170864 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="62c15b1d-702d-4d85-93d3-c949ae8e421e" containerName="glance-log" containerID="cri-o://f88372720cc99174a4231b685ba7e0d75000786f451ddc1a7d06d9bf265a62d1" gracePeriod=30
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.171395 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="62c15b1d-702d-4d85-93d3-c949ae8e421e" containerName="glance-httpd" containerID="cri-o://c44f2a943e56fc65ef0cb37678c341d6724197ce21271b237fc9ca867c7b43c8" gracePeriod=30
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.194902 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f4865d6c6-nk2sf" podStartSLOduration=8.194885643 podStartE2EDuration="8.194885643s" podCreationTimestamp="2026-02-20 16:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:44.169624543 +0000 UTC m=+1211.949669951" watchObservedRunningTime="2026-02-20 16:51:44.194885643 +0000 UTC m=+1211.974931051"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.206358 4697 generic.go:334] "Generic (PLEG): container finished" podID="21ea8664-e35f-40a2-9f86-391033cfc7bd" containerID="3173ad1cceb9dfcdf17d2bb161110a9af4927e082a45a35d2d9cd40241c69c84" exitCode=0
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.207523 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" event={"ID":"21ea8664-e35f-40a2-9f86-391033cfc7bd","Type":"ContainerDied","Data":"3173ad1cceb9dfcdf17d2bb161110a9af4927e082a45a35d2d9cd40241c69c84"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.210847 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=36.210829915 podStartE2EDuration="36.210829915s" podCreationTimestamp="2026-02-20 16:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:44.206085248 +0000 UTC m=+1211.986130656" watchObservedRunningTime="2026-02-20 16:51:44.210829915 +0000 UTC m=+1211.990875323"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.233770 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lv84x" event={"ID":"d1aba0bf-7e2c-4a14-894d-d1247f7356eb","Type":"ContainerStarted","Data":"5e7053b3a6eaabbc80d3c16eb184fb5761b6d49350a579ad316eebd23ea2b330"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.260915 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bc54df884-mx794" event={"ID":"215f7a56-10a1-4ae5-9071-4983dbb45b35","Type":"ContainerStarted","Data":"015f6bafbdc46dcaec02f7e0e73ce79f56321f3fcffed228acfe73ce87a3a572"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.267179 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-lv84x" podStartSLOduration=11.709295319 podStartE2EDuration="37.267138448s" podCreationTimestamp="2026-02-20 16:51:07 +0000 UTC" firstStartedPulling="2026-02-20 16:51:10.483793269 +0000 UTC m=+1178.263838677" lastFinishedPulling="2026-02-20 16:51:36.041636398 +0000 UTC m=+1203.821681806" observedRunningTime="2026-02-20 16:51:44.262016092 +0000 UTC m=+1212.042061500" watchObservedRunningTime="2026-02-20 16:51:44.267138448 +0000 UTC m=+1212.047183856"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.294045 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c794b85-nbtsc" event={"ID":"38688b72-7aff-4efe-988f-aa147e5865e2","Type":"ContainerStarted","Data":"5c904e098eff1f50369dbb1b6e8a680e4219dcb312235c69580d11495a620313"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.294198 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-679c794b85-nbtsc" podUID="38688b72-7aff-4efe-988f-aa147e5865e2" containerName="horizon-log" containerID="cri-o://5c904e098eff1f50369dbb1b6e8a680e4219dcb312235c69580d11495a620313" gracePeriod=30
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.294490 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-679c794b85-nbtsc" podUID="38688b72-7aff-4efe-988f-aa147e5865e2" containerName="horizon" containerID="cri-o://db9c2ca261106d6c4e7feae41bab1d47736f748ad72e69b7693169273a9162f7" gracePeriod=30
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.321531 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8abdcc99-bf45-4ca2-82b4-147b0a707333","Type":"ContainerStarted","Data":"08f5a3c9502f8c0be5b1bb7c859e0acca1977fb8674d7f257dbc7a4c8d3eef43"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.327454 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6bc54df884-mx794" podStartSLOduration=28.327421718 podStartE2EDuration="28.327421718s" podCreationTimestamp="2026-02-20 16:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:44.321114033 +0000 UTC m=+1212.101159441" watchObservedRunningTime="2026-02-20 16:51:44.327421718 +0000 UTC m=+1212.107467126"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.330524 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf9cb54f8-bfxkg" event={"ID":"d6667b0d-626d-4578-9767-2026d21b1583","Type":"ContainerStarted","Data":"6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.333643 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f","Type":"ContainerStarted","Data":"31d7bab99a76b21dcde31195ee96704ee92e69d68ad7f7f423f41d161a7ad9ed"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.333680 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f","Type":"ContainerStarted","Data":"9f8983e72087ad930ae19fdb48bd9e133d258ec0aee90328c0027fc97ab70b01"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.340258 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.360618 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b988e030-0e85-401e-bd22-057f9c2de43d","Type":"ContainerStarted","Data":"deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04"}
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.360776 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" podUID="44731724-9f5b-4193-82c9-4233d91bac74" containerName="dnsmasq-dns" containerID="cri-o://84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968" gracePeriod=10
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.360812 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx"
Feb 20 16:51:44 crc kubenswrapper[4697]: E0220 16:51:44.363129 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.38:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-dzcml" podUID="4bc8e40c-20d6-41e9-9f4e-25112a77e115"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.371216 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-679c794b85-nbtsc" podStartSLOduration=12.256029604 podStartE2EDuration="34.371198003s" podCreationTimestamp="2026-02-20 16:51:10 +0000 UTC" firstStartedPulling="2026-02-20 16:51:13.93995125 +0000 UTC m=+1181.719996658" lastFinishedPulling="2026-02-20 16:51:36.055119649 +0000 UTC m=+1203.835165057" observedRunningTime="2026-02-20 16:51:44.370831254 +0000 UTC m=+1212.150876662" watchObservedRunningTime="2026-02-20 16:51:44.371198003 +0000 UTC m=+1212.151243411"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.412425 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" podStartSLOduration=36.412409615 podStartE2EDuration="36.412409615s" podCreationTimestamp="2026-02-20 16:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:44.407642758 +0000 UTC m=+1212.187688166" watchObservedRunningTime="2026-02-20 16:51:44.412409615 +0000 UTC m=+1212.192455023"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.488036 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7bf9cb54f8-bfxkg" podStartSLOduration=28.488017962 podStartE2EDuration="28.488017962s" podCreationTimestamp="2026-02-20 16:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:44.438368482 +0000 UTC m=+1212.218413890" watchObservedRunningTime="2026-02-20 16:51:44.488017962 +0000 UTC m=+1212.268063390"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.518448 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=12.764757457 podStartE2EDuration="37.518411438s" podCreationTimestamp="2026-02-20 16:51:07 +0000 UTC" firstStartedPulling="2026-02-20 16:51:09.901812398 +0000 UTC m=+1177.681857806" lastFinishedPulling="2026-02-20 16:51:34.655466379 +0000 UTC m=+1202.435511787" observedRunningTime="2026-02-20 16:51:44.46555055 +0000 UTC m=+1212.245595958" watchObservedRunningTime="2026-02-20 16:51:44.518411438 +0000 UTC m=+1212.298456846"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.532810 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=9.532793351 podStartE2EDuration="9.532793351s" podCreationTimestamp="2026-02-20 16:51:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:44.482038325 +0000 UTC m=+1212.262083733" watchObservedRunningTime="2026-02-20 16:51:44.532793351 +0000 UTC m=+1212.312838759"
Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.873886 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.928687 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-config\") pod \"44731724-9f5b-4193-82c9-4233d91bac74\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.928755 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbgs2\" (UniqueName: \"kubernetes.io/projected/44731724-9f5b-4193-82c9-4233d91bac74-kube-api-access-tbgs2\") pod \"44731724-9f5b-4193-82c9-4233d91bac74\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.928789 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-svc\") pod \"44731724-9f5b-4193-82c9-4233d91bac74\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.928831 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-swift-storage-0\") pod \"44731724-9f5b-4193-82c9-4233d91bac74\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.929003 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-sb\") pod \"44731724-9f5b-4193-82c9-4233d91bac74\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.929026 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-nb\") pod \"44731724-9f5b-4193-82c9-4233d91bac74\" (UID: \"44731724-9f5b-4193-82c9-4233d91bac74\") " Feb 20 16:51:44 crc kubenswrapper[4697]: I0220 16:51:44.938750 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44731724-9f5b-4193-82c9-4233d91bac74-kube-api-access-tbgs2" (OuterVolumeSpecName: "kube-api-access-tbgs2") pod "44731724-9f5b-4193-82c9-4233d91bac74" (UID: "44731724-9f5b-4193-82c9-4233d91bac74"). InnerVolumeSpecName "kube-api-access-tbgs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.016616 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "44731724-9f5b-4193-82c9-4233d91bac74" (UID: "44731724-9f5b-4193-82c9-4233d91bac74"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.031038 4697 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.031069 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbgs2\" (UniqueName: \"kubernetes.io/projected/44731724-9f5b-4193-82c9-4233d91bac74-kube-api-access-tbgs2\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.033314 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-config" (OuterVolumeSpecName: "config") pod "44731724-9f5b-4193-82c9-4233d91bac74" (UID: "44731724-9f5b-4193-82c9-4233d91bac74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.051095 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "44731724-9f5b-4193-82c9-4233d91bac74" (UID: "44731724-9f5b-4193-82c9-4233d91bac74"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.051305 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "44731724-9f5b-4193-82c9-4233d91bac74" (UID: "44731724-9f5b-4193-82c9-4233d91bac74"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.107523 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44731724-9f5b-4193-82c9-4233d91bac74" (UID: "44731724-9f5b-4193-82c9-4233d91bac74"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.133660 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.133705 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.133718 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.133730 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44731724-9f5b-4193-82c9-4233d91bac74-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.388813 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" event={"ID":"21ea8664-e35f-40a2-9f86-391033cfc7bd","Type":"ContainerStarted","Data":"628bafb9542b89c830ba13bcb4462d7a2f72ae2dc774d358da8bdcae5ac6c56f"} Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.389578 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.401779 4697 generic.go:334] "Generic (PLEG): container finished" podID="44731724-9f5b-4193-82c9-4233d91bac74" containerID="84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968" exitCode=0 Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.401925 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" event={"ID":"44731724-9f5b-4193-82c9-4233d91bac74","Type":"ContainerDied","Data":"84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968"} Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.401980 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" event={"ID":"44731724-9f5b-4193-82c9-4233d91bac74","Type":"ContainerDied","Data":"128c542707cb566330e27c0560bddb65a8272d4fcd486b7db631fd5647d26507"} Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.402004 4697 scope.go:117] "RemoveContainer" containerID="84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.402221 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc676dfcf-zstfx" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.420780 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bc54df884-mx794" event={"ID":"215f7a56-10a1-4ae5-9071-4983dbb45b35","Type":"ContainerStarted","Data":"bbdb06e1ab1967ef0b35bead6c1a836ad1af58dca410fe431ccaeb3dc9cae4a7"} Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.421989 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" podStartSLOduration=9.421979566 podStartE2EDuration="9.421979566s" podCreationTimestamp="2026-02-20 16:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:45.406597568 +0000 UTC m=+1213.186642976" watchObservedRunningTime="2026-02-20 16:51:45.421979566 +0000 UTC m=+1213.202024974" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.434360 4697 generic.go:334] "Generic (PLEG): container finished" podID="6f7e5f6d-970f-4773-8a37-bb831984fc44" containerID="f1637e5b0b55c8e906489a5de7f78deeb49a5a9ca42b65d7563ef908e139eb9d" exitCode=0 Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.434389 4697 generic.go:334] "Generic (PLEG): container finished" podID="6f7e5f6d-970f-4773-8a37-bb831984fc44" containerID="1ca32382fd227cc2c6e15cab981256bde06320af92c5c67127b470fe5c3a577e" exitCode=143 Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.434428 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f7e5f6d-970f-4773-8a37-bb831984fc44","Type":"ContainerDied","Data":"f1637e5b0b55c8e906489a5de7f78deeb49a5a9ca42b65d7563ef908e139eb9d"} Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.434517 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6f7e5f6d-970f-4773-8a37-bb831984fc44","Type":"ContainerDied","Data":"1ca32382fd227cc2c6e15cab981256bde06320af92c5c67127b470fe5c3a577e"} Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.454498 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc676dfcf-zstfx"] Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.463993 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dc676dfcf-zstfx"] Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.468587 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57ffcc5f55-zw4vb" event={"ID":"b3577c39-e7f5-4def-97f1-8e68e4bc07dd","Type":"ContainerStarted","Data":"44f3535f183a3dc31ea58d908f7531be76c8b45705b172c9402ca3cbfc9e6204"} Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.468661 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57ffcc5f55-zw4vb" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.473615 4697 generic.go:334] "Generic (PLEG): container finished" podID="62c15b1d-702d-4d85-93d3-c949ae8e421e" containerID="c44f2a943e56fc65ef0cb37678c341d6724197ce21271b237fc9ca867c7b43c8" exitCode=0 Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.473641 4697 generic.go:334] "Generic (PLEG): container finished" podID="62c15b1d-702d-4d85-93d3-c949ae8e421e" containerID="f88372720cc99174a4231b685ba7e0d75000786f451ddc1a7d06d9bf265a62d1" exitCode=143 Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.473683 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"62c15b1d-702d-4d85-93d3-c949ae8e421e","Type":"ContainerDied","Data":"c44f2a943e56fc65ef0cb37678c341d6724197ce21271b237fc9ca867c7b43c8"} Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.473704 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"62c15b1d-702d-4d85-93d3-c949ae8e421e","Type":"ContainerDied","Data":"f88372720cc99174a4231b685ba7e0d75000786f451ddc1a7d06d9bf265a62d1"} Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.479568 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c794b85-nbtsc" event={"ID":"38688b72-7aff-4efe-988f-aa147e5865e2","Type":"ContainerStarted","Data":"db9c2ca261106d6c4e7feae41bab1d47736f748ad72e69b7693169273a9162f7"} Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.484753 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf9cb54f8-bfxkg" event={"ID":"d6667b0d-626d-4578-9767-2026d21b1583","Type":"ContainerStarted","Data":"29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7"} Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.500852 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57ffcc5f55-zw4vb" podStartSLOduration=7.500835132 podStartE2EDuration="7.500835132s" podCreationTimestamp="2026-02-20 16:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:45.485552577 +0000 UTC m=+1213.265597995" watchObservedRunningTime="2026-02-20 16:51:45.500835132 +0000 UTC m=+1213.280880540" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.551711 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 20 16:51:45 crc kubenswrapper[4697]: I0220 16:51:45.552470 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.070932 4697 scope.go:117] "RemoveContainer" containerID="5d06133e9dc95c94283a23ded1a16c174e639b328ea08d89b993cb3e04a5ab75" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.254686 4697 scope.go:117] "RemoveContainer" 
containerID="84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968" Feb 20 16:51:46 crc kubenswrapper[4697]: E0220 16:51:46.255368 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968\": container with ID starting with 84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968 not found: ID does not exist" containerID="84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.255574 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968"} err="failed to get container status \"84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968\": rpc error: code = NotFound desc = could not find container \"84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968\": container with ID starting with 84525fbac1874b4b3393589d67548626bafdc026856364f9b8a65847266c9968 not found: ID does not exist" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.255682 4697 scope.go:117] "RemoveContainer" containerID="5d06133e9dc95c94283a23ded1a16c174e639b328ea08d89b993cb3e04a5ab75" Feb 20 16:51:46 crc kubenswrapper[4697]: E0220 16:51:46.255943 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d06133e9dc95c94283a23ded1a16c174e639b328ea08d89b993cb3e04a5ab75\": container with ID starting with 5d06133e9dc95c94283a23ded1a16c174e639b328ea08d89b993cb3e04a5ab75 not found: ID does not exist" containerID="5d06133e9dc95c94283a23ded1a16c174e639b328ea08d89b993cb3e04a5ab75" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.256034 4697 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5d06133e9dc95c94283a23ded1a16c174e639b328ea08d89b993cb3e04a5ab75"} err="failed to get container status \"5d06133e9dc95c94283a23ded1a16c174e639b328ea08d89b993cb3e04a5ab75\": rpc error: code = NotFound desc = could not find container \"5d06133e9dc95c94283a23ded1a16c174e639b328ea08d89b993cb3e04a5ab75\": container with ID starting with 5d06133e9dc95c94283a23ded1a16c174e639b328ea08d89b993cb3e04a5ab75 not found: ID does not exist" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.280255 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.292141 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.470298 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-internal-tls-certs\") pod \"62c15b1d-702d-4d85-93d3-c949ae8e421e\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.470812 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-combined-ca-bundle\") pod \"6f7e5f6d-970f-4773-8a37-bb831984fc44\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.470842 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-logs\") pod \"6f7e5f6d-970f-4773-8a37-bb831984fc44\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.470890 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-logs\") pod \"62c15b1d-702d-4d85-93d3-c949ae8e421e\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.470911 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-public-tls-certs\") pod \"6f7e5f6d-970f-4773-8a37-bb831984fc44\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471011 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-config-data\") pod \"62c15b1d-702d-4d85-93d3-c949ae8e421e\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471298 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-457qx\" (UniqueName: \"kubernetes.io/projected/62c15b1d-702d-4d85-93d3-c949ae8e421e-kube-api-access-457qx\") pod \"62c15b1d-702d-4d85-93d3-c949ae8e421e\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471332 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-combined-ca-bundle\") pod \"62c15b1d-702d-4d85-93d3-c949ae8e421e\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471371 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-logs" (OuterVolumeSpecName: "logs") pod "6f7e5f6d-970f-4773-8a37-bb831984fc44" (UID: "6f7e5f6d-970f-4773-8a37-bb831984fc44"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471395 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-logs" (OuterVolumeSpecName: "logs") pod "62c15b1d-702d-4d85-93d3-c949ae8e421e" (UID: "62c15b1d-702d-4d85-93d3-c949ae8e421e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471739 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wghq\" (UniqueName: \"kubernetes.io/projected/6f7e5f6d-970f-4773-8a37-bb831984fc44-kube-api-access-4wghq\") pod \"6f7e5f6d-970f-4773-8a37-bb831984fc44\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471796 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-httpd-run\") pod \"62c15b1d-702d-4d85-93d3-c949ae8e421e\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471836 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-scripts\") pod \"62c15b1d-702d-4d85-93d3-c949ae8e421e\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471867 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"6f7e5f6d-970f-4773-8a37-bb831984fc44\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471891 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-httpd-run\") pod \"6f7e5f6d-970f-4773-8a37-bb831984fc44\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471920 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-config-data\") pod \"6f7e5f6d-970f-4773-8a37-bb831984fc44\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471936 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-scripts\") pod \"6f7e5f6d-970f-4773-8a37-bb831984fc44\" (UID: \"6f7e5f6d-970f-4773-8a37-bb831984fc44\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.471973 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"62c15b1d-702d-4d85-93d3-c949ae8e421e\" (UID: \"62c15b1d-702d-4d85-93d3-c949ae8e421e\") " Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.472394 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.472406 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.473126 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "62c15b1d-702d-4d85-93d3-c949ae8e421e" 
(UID: "62c15b1d-702d-4d85-93d3-c949ae8e421e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.481798 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6f7e5f6d-970f-4773-8a37-bb831984fc44" (UID: "6f7e5f6d-970f-4773-8a37-bb831984fc44"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.487571 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "62c15b1d-702d-4d85-93d3-c949ae8e421e" (UID: "62c15b1d-702d-4d85-93d3-c949ae8e421e"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.487596 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-scripts" (OuterVolumeSpecName: "scripts") pod "62c15b1d-702d-4d85-93d3-c949ae8e421e" (UID: "62c15b1d-702d-4d85-93d3-c949ae8e421e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.487902 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f7e5f6d-970f-4773-8a37-bb831984fc44-kube-api-access-4wghq" (OuterVolumeSpecName: "kube-api-access-4wghq") pod "6f7e5f6d-970f-4773-8a37-bb831984fc44" (UID: "6f7e5f6d-970f-4773-8a37-bb831984fc44"). InnerVolumeSpecName "kube-api-access-4wghq". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.492735 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-scripts" (OuterVolumeSpecName: "scripts") pod "6f7e5f6d-970f-4773-8a37-bb831984fc44" (UID: "6f7e5f6d-970f-4773-8a37-bb831984fc44"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.496046 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"62c15b1d-702d-4d85-93d3-c949ae8e421e","Type":"ContainerDied","Data":"5265c1c686daae3d2de42e92d461d9552aa3529efbe6b83f290bc643e85bf2fa"}
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.496106 4697 scope.go:117] "RemoveContainer" containerID="c44f2a943e56fc65ef0cb37678c341d6724197ce21271b237fc9ca867c7b43c8"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.496236 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.499564 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "6f7e5f6d-970f-4773-8a37-bb831984fc44" (UID: "6f7e5f6d-970f-4773-8a37-bb831984fc44"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.509705 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c15b1d-702d-4d85-93d3-c949ae8e421e-kube-api-access-457qx" (OuterVolumeSpecName: "kube-api-access-457qx") pod "62c15b1d-702d-4d85-93d3-c949ae8e421e" (UID: "62c15b1d-702d-4d85-93d3-c949ae8e421e"). InnerVolumeSpecName "kube-api-access-457qx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.515999 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b988e030-0e85-401e-bd22-057f9c2de43d","Type":"ContainerStarted","Data":"34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be"}
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.542584 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.543090 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f7e5f6d-970f-4773-8a37-bb831984fc44","Type":"ContainerDied","Data":"aea2fe477858d26298ef259e688d9a48b10625a09c4f9b4a755f3ea8eeada7a5"}
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.543506 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.544086 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f7e5f6d-970f-4773-8a37-bb831984fc44" (UID: "6f7e5f6d-970f-4773-8a37-bb831984fc44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.571548 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62c15b1d-702d-4d85-93d3-c949ae8e421e" (UID: "62c15b1d-702d-4d85-93d3-c949ae8e421e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.573815 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.576447 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-457qx\" (UniqueName: \"kubernetes.io/projected/62c15b1d-702d-4d85-93d3-c949ae8e421e-kube-api-access-457qx\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.576566 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.576633 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wghq\" (UniqueName: \"kubernetes.io/projected/6f7e5f6d-970f-4773-8a37-bb831984fc44-kube-api-access-4wghq\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.576688 4697 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/62c15b1d-702d-4d85-93d3-c949ae8e421e-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.576745 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.576816 4697 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.576872 4697 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f7e5f6d-970f-4773-8a37-bb831984fc44-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.576927 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.576984 4697 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.586630 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-config-data" (OuterVolumeSpecName: "config-data") pod "6f7e5f6d-970f-4773-8a37-bb831984fc44" (UID: "6f7e5f6d-970f-4773-8a37-bb831984fc44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.593611 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.170:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.615573 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "62c15b1d-702d-4d85-93d3-c949ae8e421e" (UID: "62c15b1d-702d-4d85-93d3-c949ae8e421e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.617006 4697 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.638736 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-config-data" (OuterVolumeSpecName: "config-data") pod "62c15b1d-702d-4d85-93d3-c949ae8e421e" (UID: "62c15b1d-702d-4d85-93d3-c949ae8e421e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.645868 4697 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.667020 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6f7e5f6d-970f-4773-8a37-bb831984fc44" (UID: "6f7e5f6d-970f-4773-8a37-bb831984fc44"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.679484 4697 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.679771 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.679843 4697 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.679906 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7e5f6d-970f-4773-8a37-bb831984fc44-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.680034 4697 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.680094 4697 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62c15b1d-702d-4d85-93d3-c949ae8e421e-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.746330 4697 scope.go:117] "RemoveContainer" containerID="f88372720cc99174a4231b685ba7e0d75000786f451ddc1a7d06d9bf265a62d1"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.791810 4697 scope.go:117] "RemoveContainer" containerID="f1637e5b0b55c8e906489a5de7f78deeb49a5a9ca42b65d7563ef908e139eb9d"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.833675 4697 scope.go:117] "RemoveContainer" containerID="1ca32382fd227cc2c6e15cab981256bde06320af92c5c67127b470fe5c3a577e"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.836491 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.875848 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.943179 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44731724-9f5b-4193-82c9-4233d91bac74" path="/var/lib/kubelet/pods/44731724-9f5b-4193-82c9-4233d91bac74/volumes"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.943801 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c15b1d-702d-4d85-93d3-c949ae8e421e" path="/var/lib/kubelet/pods/62c15b1d-702d-4d85-93d3-c949ae8e421e/volumes"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.945820 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 16:51:46 crc kubenswrapper[4697]: E0220 16:51:46.946103 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44731724-9f5b-4193-82c9-4233d91bac74" containerName="dnsmasq-dns"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.946114 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="44731724-9f5b-4193-82c9-4233d91bac74" containerName="dnsmasq-dns"
Feb 20 16:51:46 crc kubenswrapper[4697]: E0220 16:51:46.946130 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7e5f6d-970f-4773-8a37-bb831984fc44" containerName="glance-log"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.946136 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7e5f6d-970f-4773-8a37-bb831984fc44" containerName="glance-log"
Feb 20 16:51:46 crc kubenswrapper[4697]: E0220 16:51:46.946145 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c15b1d-702d-4d85-93d3-c949ae8e421e" containerName="glance-log"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.946150 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c15b1d-702d-4d85-93d3-c949ae8e421e" containerName="glance-log"
Feb 20 16:51:46 crc kubenswrapper[4697]: E0220 16:51:46.946173 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c15b1d-702d-4d85-93d3-c949ae8e421e" containerName="glance-httpd"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.946179 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c15b1d-702d-4d85-93d3-c949ae8e421e" containerName="glance-httpd"
Feb 20 16:51:46 crc kubenswrapper[4697]: E0220 16:51:46.946198 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44731724-9f5b-4193-82c9-4233d91bac74" containerName="init"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.946203 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="44731724-9f5b-4193-82c9-4233d91bac74" containerName="init"
Feb 20 16:51:46 crc kubenswrapper[4697]: E0220 16:51:46.946213 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7e5f6d-970f-4773-8a37-bb831984fc44" containerName="glance-httpd"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.946218 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7e5f6d-970f-4773-8a37-bb831984fc44" containerName="glance-httpd"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.946370 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c15b1d-702d-4d85-93d3-c949ae8e421e" containerName="glance-log"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.946385 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f7e5f6d-970f-4773-8a37-bb831984fc44" containerName="glance-log"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.946394 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c15b1d-702d-4d85-93d3-c949ae8e421e" containerName="glance-httpd"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.946411 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f7e5f6d-970f-4773-8a37-bb831984fc44" containerName="glance-httpd"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.946419 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="44731724-9f5b-4193-82c9-4233d91bac74" containerName="dnsmasq-dns"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.947328 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.965247 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.965532 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.965720 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.965962 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qzcv5"
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.972876 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 16:51:46 crc kubenswrapper[4697]: I0220 16:51:46.993550 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.002867 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.013502 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.015052 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.027687 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.030681 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.033709 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.083966 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.088789 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.088854 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.088877 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.088921 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-logs\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.088944 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.088966 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.088991 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn2rm\" (UniqueName: \"kubernetes.io/projected/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-kube-api-access-kn2rm\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.089018 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.122547 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6bc54df884-mx794"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.122620 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6bc54df884-mx794"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.125696 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7bf9cb54f8-bfxkg"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.125751 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bf9cb54f8-bfxkg"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.191848 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.191903 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.191924 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dl5f\" (UniqueName: \"kubernetes.io/projected/db97975b-6aae-4903-830f-13bfdd9a47b3-kube-api-access-2dl5f\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.191979 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.191999 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192043 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-logs\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192072 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192117 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192135 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192153 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192194 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-logs\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192208 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-scripts\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192232 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192251 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-config-data\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192266 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192290 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn2rm\" (UniqueName: \"kubernetes.io/projected/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-kube-api-access-kn2rm\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192721 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.192863 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.194047 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-logs\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.200283 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.202659 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.206358 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.211118 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.222266 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn2rm\" (UniqueName: \"kubernetes.io/projected/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-kube-api-access-kn2rm\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.228147 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.294231 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-scripts\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.294901 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-config-data\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.295032 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.295099 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dl5f\" (UniqueName: \"kubernetes.io/projected/db97975b-6aae-4903-830f-13bfdd9a47b3-kube-api-access-2dl5f\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.295170 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.295242 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.295335 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-logs\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.295480 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.295674 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.298694 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.299136 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-logs\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.302647 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.309271 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-config-data\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.319372 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-scripts\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.322938 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.322966 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.327277 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dl5f\" (UniqueName: \"kubernetes.io/projected/db97975b-6aae-4903-830f-13bfdd9a47b3-kube-api-access-2dl5f\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.365458 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " pod="openstack/glance-default-external-api-0"
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.569856 4697 generic.go:334] "Generic (PLEG): container finished" podID="d1aba0bf-7e2c-4a14-894d-d1247f7356eb" containerID="5e7053b3a6eaabbc80d3c16eb184fb5761b6d49350a579ad316eebd23ea2b330" exitCode=0
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.570141 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lv84x" event={"ID":"d1aba0bf-7e2c-4a14-894d-d1247f7356eb","Type":"ContainerDied","Data":"5e7053b3a6eaabbc80d3c16eb184fb5761b6d49350a579ad316eebd23ea2b330"}
Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.649663 4697 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.925715 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.926061 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.974356 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Feb 20 16:51:47 crc kubenswrapper[4697]: I0220 16:51:47.999318 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.050530 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.053948 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.296777 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 16:51:48 crc kubenswrapper[4697]: W0220 16:51:48.304027 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb97975b_6aae_4903_830f_13bfdd9a47b3.slice/crio-9731608616d9ba8a33fa6a0d6098ea044a689220112743d8115ef5cc35a3b080 WatchSource:0}: Error finding container 9731608616d9ba8a33fa6a0d6098ea044a689220112743d8115ef5cc35a3b080: Status 404 returned error can't find the container with id 9731608616d9ba8a33fa6a0d6098ea044a689220112743d8115ef5cc35a3b080 Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.590082 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-hjgnj" event={"ID":"39238d29-1c84-4169-b011-455bd2e7f000","Type":"ContainerStarted","Data":"27bc6ecc4912db01e3f0bc28e595316ae27904ad3409f54a97d4a9cec5f1a769"} Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.592729 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c","Type":"ContainerStarted","Data":"ba05bda7b682ff71d79b2ac3861a6d217cdddc22ea8e7020461aca63096e1f61"} Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.609219 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"db97975b-6aae-4903-830f-13bfdd9a47b3","Type":"ContainerStarted","Data":"9731608616d9ba8a33fa6a0d6098ea044a689220112743d8115ef5cc35a3b080"} Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.611378 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.623502 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-hjgnj" podStartSLOduration=4.887675044 podStartE2EDuration="41.623482774s" podCreationTimestamp="2026-02-20 16:51:07 +0000 UTC" firstStartedPulling="2026-02-20 16:51:10.382335788 +0000 UTC m=+1178.162381196" lastFinishedPulling="2026-02-20 16:51:47.118143518 +0000 UTC m=+1214.898188926" observedRunningTime="2026-02-20 16:51:48.617319673 +0000 UTC m=+1216.397365101" watchObservedRunningTime="2026-02-20 16:51:48.623482774 +0000 UTC m=+1216.403528192" Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.689665 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.742052 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 20 16:51:48 crc 
kubenswrapper[4697]: I0220 16:51:48.778778 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.810412 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 20 16:51:48 crc kubenswrapper[4697]: I0220 16:51:48.927461 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f7e5f6d-970f-4773-8a37-bb831984fc44" path="/var/lib/kubelet/pods/6f7e5f6d-970f-4773-8a37-bb831984fc44/volumes" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.034487 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.162588 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b22hw\" (UniqueName: \"kubernetes.io/projected/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-kube-api-access-b22hw\") pod \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.163013 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-scripts\") pod \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.163050 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-logs\") pod \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.163207 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-combined-ca-bundle\") pod \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.165014 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-logs" (OuterVolumeSpecName: "logs") pod "d1aba0bf-7e2c-4a14-894d-d1247f7356eb" (UID: "d1aba0bf-7e2c-4a14-894d-d1247f7356eb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.165090 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-config-data\") pod \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\" (UID: \"d1aba0bf-7e2c-4a14-894d-d1247f7356eb\") " Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.165982 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.168327 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-scripts" (OuterVolumeSpecName: "scripts") pod "d1aba0bf-7e2c-4a14-894d-d1247f7356eb" (UID: "d1aba0bf-7e2c-4a14-894d-d1247f7356eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.169084 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-kube-api-access-b22hw" (OuterVolumeSpecName: "kube-api-access-b22hw") pod "d1aba0bf-7e2c-4a14-894d-d1247f7356eb" (UID: "d1aba0bf-7e2c-4a14-894d-d1247f7356eb"). 
InnerVolumeSpecName "kube-api-access-b22hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.219160 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-config-data" (OuterVolumeSpecName: "config-data") pod "d1aba0bf-7e2c-4a14-894d-d1247f7356eb" (UID: "d1aba0bf-7e2c-4a14-894d-d1247f7356eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.256575 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1aba0bf-7e2c-4a14-894d-d1247f7356eb" (UID: "d1aba0bf-7e2c-4a14-894d-d1247f7356eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.268088 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.268117 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.268144 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b22hw\" (UniqueName: \"kubernetes.io/projected/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-kube-api-access-b22hw\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.268157 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d1aba0bf-7e2c-4a14-894d-d1247f7356eb-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.695372 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c","Type":"ContainerStarted","Data":"5da033a97a3e25310bf04bbbf86d1464185a251c43b5c741b61fd0b34e5d176b"} Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.707376 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b64457bc4-cnrrj"] Feb 20 16:51:49 crc kubenswrapper[4697]: E0220 16:51:49.707922 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1aba0bf-7e2c-4a14-894d-d1247f7356eb" containerName="placement-db-sync" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.707939 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1aba0bf-7e2c-4a14-894d-d1247f7356eb" containerName="placement-db-sync" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.708114 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1aba0bf-7e2c-4a14-894d-d1247f7356eb" containerName="placement-db-sync" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.710227 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.712648 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.713030 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.723538 4697 generic.go:334] "Generic (PLEG): container finished" podID="aea295ce-e05d-46a9-9a39-94ffc4b29826" containerID="5831b527e2e4be06390860f6b38fedefa29a4d8e84e38d5eca3a5e9fb5b2db28" exitCode=1 Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.723601 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aea295ce-e05d-46a9-9a39-94ffc4b29826","Type":"ContainerDied","Data":"5831b527e2e4be06390860f6b38fedefa29a4d8e84e38d5eca3a5e9fb5b2db28"} Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.724044 4697 scope.go:117] "RemoveContainer" containerID="5831b527e2e4be06390860f6b38fedefa29a4d8e84e38d5eca3a5e9fb5b2db28" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.735979 4697 generic.go:334] "Generic (PLEG): container finished" podID="36fa3e27-befa-408d-ac73-377e04786963" containerID="b4f4b6c7eceae72eaf36f93145369ac145650bbb3d018f5decf6183a22506971" exitCode=0 Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.736043 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dz6jg" event={"ID":"36fa3e27-befa-408d-ac73-377e04786963","Type":"ContainerDied","Data":"b4f4b6c7eceae72eaf36f93145369ac145650bbb3d018f5decf6183a22506971"} Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.773989 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b64457bc4-cnrrj"] Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.777065 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-lv84x" event={"ID":"d1aba0bf-7e2c-4a14-894d-d1247f7356eb","Type":"ContainerDied","Data":"196a85827544d29dc5265edee8f288ca5cf52ce433e549303d14caaf230f5cb5"} Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.777106 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="196a85827544d29dc5265edee8f288ca5cf52ce433e549303d14caaf230f5cb5" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.777082 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lv84x" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.789679 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"db97975b-6aae-4903-830f-13bfdd9a47b3","Type":"ContainerStarted","Data":"f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142"} Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.802575 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9k8t\" (UniqueName: \"kubernetes.io/projected/15b0bf59-8d86-4210-a901-2841715a8487-kube-api-access-v9k8t\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.802903 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15b0bf59-8d86-4210-a901-2841715a8487-logs\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.803040 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-combined-ca-bundle\") pod 
\"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.803174 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-scripts\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.803268 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-internal-tls-certs\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.803386 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-config-data\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.803502 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-public-tls-certs\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.916219 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-combined-ca-bundle\") pod 
\"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.916299 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-scripts\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.916338 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-internal-tls-certs\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.916394 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-config-data\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.916420 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-public-tls-certs\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.916479 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9k8t\" (UniqueName: \"kubernetes.io/projected/15b0bf59-8d86-4210-a901-2841715a8487-kube-api-access-v9k8t\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " 
pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.916497 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15b0bf59-8d86-4210-a901-2841715a8487-logs\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.923487 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-public-tls-certs\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.924025 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15b0bf59-8d86-4210-a901-2841715a8487-logs\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.924483 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-scripts\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.926076 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-internal-tls-certs\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.929026 4697 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-combined-ca-bundle\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.929289 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-config-data\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:49 crc kubenswrapper[4697]: I0220 16:51:49.948401 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9k8t\" (UniqueName: \"kubernetes.io/projected/15b0bf59-8d86-4210-a901-2841715a8487-kube-api-access-v9k8t\") pod \"placement-b64457bc4-cnrrj\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:50 crc kubenswrapper[4697]: I0220 16:51:50.129473 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:51:50 crc kubenswrapper[4697]: I0220 16:51:50.662072 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b64457bc4-cnrrj"] Feb 20 16:51:50 crc kubenswrapper[4697]: I0220 16:51:50.812646 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b64457bc4-cnrrj" event={"ID":"15b0bf59-8d86-4210-a901-2841715a8487","Type":"ContainerStarted","Data":"595f7c98c54449c156d93e9f96d3e3378eca083413f2a8082daea62837242365"} Feb 20 16:51:50 crc kubenswrapper[4697]: I0220 16:51:50.817999 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c","Type":"ContainerStarted","Data":"3ac5779a2c08cfdf37177a0b0971baaa2d0dd5bc7ccf8aabcba210efda1da730"} Feb 20 16:51:50 crc kubenswrapper[4697]: I0220 16:51:50.829133 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aea295ce-e05d-46a9-9a39-94ffc4b29826","Type":"ContainerStarted","Data":"49cb9fb17195ab4325f654d4c9d330b48ea9e9c735b527b73e1728e5326f1b97"} Feb 20 16:51:50 crc kubenswrapper[4697]: I0220 16:51:50.829264 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="aea295ce-e05d-46a9-9a39-94ffc4b29826" containerName="watcher-decision-engine" containerID="cri-o://49cb9fb17195ab4325f654d4c9d330b48ea9e9c735b527b73e1728e5326f1b97" gracePeriod=30 Feb 20 16:51:50 crc kubenswrapper[4697]: I0220 16:51:50.831953 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="8abdcc99-bf45-4ca2-82b4-147b0a707333" containerName="watcher-applier" containerID="cri-o://08f5a3c9502f8c0be5b1bb7c859e0acca1977fb8674d7f257dbc7a4c8d3eef43" gracePeriod=30 Feb 20 16:51:50 crc kubenswrapper[4697]: I0220 16:51:50.832641 4697 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-external-api-0" event={"ID":"db97975b-6aae-4903-830f-13bfdd9a47b3","Type":"ContainerStarted","Data":"1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67"} Feb 20 16:51:50 crc kubenswrapper[4697]: I0220 16:51:50.845113 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.845099879 podStartE2EDuration="4.845099879s" podCreationTimestamp="2026-02-20 16:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:50.838149078 +0000 UTC m=+1218.618194486" watchObservedRunningTime="2026-02-20 16:51:50.845099879 +0000 UTC m=+1218.625145287" Feb 20 16:51:50 crc kubenswrapper[4697]: I0220 16:51:50.896155 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.896133682 podStartE2EDuration="4.896133682s" podCreationTimestamp="2026-02-20 16:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:50.885450019 +0000 UTC m=+1218.665495437" watchObservedRunningTime="2026-02-20 16:51:50.896133682 +0000 UTC m=+1218.676179090" Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.012617 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.209529 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dz6jg" Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.367035 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-config-data\") pod \"36fa3e27-befa-408d-ac73-377e04786963\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.367623 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbfpp\" (UniqueName: \"kubernetes.io/projected/36fa3e27-befa-408d-ac73-377e04786963-kube-api-access-mbfpp\") pod \"36fa3e27-befa-408d-ac73-377e04786963\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.367642 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-credential-keys\") pod \"36fa3e27-befa-408d-ac73-377e04786963\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.367829 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-fernet-keys\") pod \"36fa3e27-befa-408d-ac73-377e04786963\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.367882 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-scripts\") pod \"36fa3e27-befa-408d-ac73-377e04786963\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") " Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.367931 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-combined-ca-bundle\") pod \"36fa3e27-befa-408d-ac73-377e04786963\" (UID: \"36fa3e27-befa-408d-ac73-377e04786963\") "
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.373036 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "36fa3e27-befa-408d-ac73-377e04786963" (UID: "36fa3e27-befa-408d-ac73-377e04786963"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.374624 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fa3e27-befa-408d-ac73-377e04786963-kube-api-access-mbfpp" (OuterVolumeSpecName: "kube-api-access-mbfpp") pod "36fa3e27-befa-408d-ac73-377e04786963" (UID: "36fa3e27-befa-408d-ac73-377e04786963"). InnerVolumeSpecName "kube-api-access-mbfpp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.375065 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "36fa3e27-befa-408d-ac73-377e04786963" (UID: "36fa3e27-befa-408d-ac73-377e04786963"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.375201 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-scripts" (OuterVolumeSpecName: "scripts") pod "36fa3e27-befa-408d-ac73-377e04786963" (UID: "36fa3e27-befa-408d-ac73-377e04786963"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.403668 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-config-data" (OuterVolumeSpecName: "config-data") pod "36fa3e27-befa-408d-ac73-377e04786963" (UID: "36fa3e27-befa-408d-ac73-377e04786963"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.423268 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36fa3e27-befa-408d-ac73-377e04786963" (UID: "36fa3e27-befa-408d-ac73-377e04786963"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.469773 4697 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.469805 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.469813 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.469823 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.469831 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbfpp\" (UniqueName: \"kubernetes.io/projected/36fa3e27-befa-408d-ac73-377e04786963-kube-api-access-mbfpp\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.469840 4697 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/36fa3e27-befa-408d-ac73-377e04786963-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.709190 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.795996 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d9f898cc5-9k7vr"]
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.796524 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" podUID="8dc1f573-c5c3-436b-9c10-820add978d5b" containerName="dnsmasq-dns" containerID="cri-o://20536d476bffd66b144be093eb27eb0aa93f06d37191f8f856c532b9ec985e0c" gracePeriod=10
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.859094 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b64457bc4-cnrrj" event={"ID":"15b0bf59-8d86-4210-a901-2841715a8487","Type":"ContainerStarted","Data":"accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a"}
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.859146 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b64457bc4-cnrrj" event={"ID":"15b0bf59-8d86-4210-a901-2841715a8487","Type":"ContainerStarted","Data":"c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279"}
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.859798 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b64457bc4-cnrrj"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.859826 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b64457bc4-cnrrj"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.869982 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dz6jg"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.870532 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dz6jg" event={"ID":"36fa3e27-befa-408d-ac73-377e04786963","Type":"ContainerDied","Data":"62a3756852d10033b25d41a30fc73cd1602101e232e736e1424d58d655f39a3d"}
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.870586 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62a3756852d10033b25d41a30fc73cd1602101e232e736e1424d58d655f39a3d"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.871611 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" podUID="8dc1f573-c5c3-436b-9c10-820add978d5b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.918997 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-85d8b4ccc6-tdklm"]
Feb 20 16:51:51 crc kubenswrapper[4697]: E0220 16:51:51.919382 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fa3e27-befa-408d-ac73-377e04786963" containerName="keystone-bootstrap"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.919395 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fa3e27-befa-408d-ac73-377e04786963" containerName="keystone-bootstrap"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.919571 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="36fa3e27-befa-408d-ac73-377e04786963" containerName="keystone-bootstrap"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.920231 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.924232 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.925665 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.925862 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.925967 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nd79"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.926241 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.926353 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.933287 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-b64457bc4-cnrrj" podStartSLOduration=2.9332696 podStartE2EDuration="2.9332696s" podCreationTimestamp="2026-02-20 16:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:51:51.904867603 +0000 UTC m=+1219.684913011" watchObservedRunningTime="2026-02-20 16:51:51.9332696 +0000 UTC m=+1219.713315008"
Feb 20 16:51:51 crc kubenswrapper[4697]: I0220 16:51:51.960298 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85d8b4ccc6-tdklm"]
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.126137 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-public-tls-certs\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.126241 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-internal-tls-certs\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.126289 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-credential-keys\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.126312 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-config-data\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.126351 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-combined-ca-bundle\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.126375 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-scripts\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.126406 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-fernet-keys\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.126428 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6wpv\" (UniqueName: \"kubernetes.io/projected/9e1ff974-0f05-484c-b267-f1537ec9495e-kube-api-access-l6wpv\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.228183 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6wpv\" (UniqueName: \"kubernetes.io/projected/9e1ff974-0f05-484c-b267-f1537ec9495e-kube-api-access-l6wpv\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.228257 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-public-tls-certs\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.228321 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-internal-tls-certs\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.228345 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-credential-keys\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.228364 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-config-data\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.228400 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-combined-ca-bundle\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.228423 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-scripts\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.228467 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-fernet-keys\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.235171 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-config-data\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.235866 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-internal-tls-certs\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.238000 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-credential-keys\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.238209 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-fernet-keys\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.238645 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-combined-ca-bundle\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.238731 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-scripts\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.242408 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e1ff974-0f05-484c-b267-f1537ec9495e-public-tls-certs\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.244026 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6wpv\" (UniqueName: \"kubernetes.io/projected/9e1ff974-0f05-484c-b267-f1537ec9495e-kube-api-access-l6wpv\") pod \"keystone-85d8b4ccc6-tdklm\" (UID: \"9e1ff974-0f05-484c-b267-f1537ec9495e\") " pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.303884 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-85d8b4ccc6-tdklm"
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.886419 4697 generic.go:334] "Generic (PLEG): container finished" podID="39238d29-1c84-4169-b011-455bd2e7f000" containerID="27bc6ecc4912db01e3f0bc28e595316ae27904ad3409f54a97d4a9cec5f1a769" exitCode=0
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.886614 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hjgnj" event={"ID":"39238d29-1c84-4169-b011-455bd2e7f000","Type":"ContainerDied","Data":"27bc6ecc4912db01e3f0bc28e595316ae27904ad3409f54a97d4a9cec5f1a769"}
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.893237 4697 generic.go:334] "Generic (PLEG): container finished" podID="8dc1f573-c5c3-436b-9c10-820add978d5b" containerID="20536d476bffd66b144be093eb27eb0aa93f06d37191f8f856c532b9ec985e0c" exitCode=0
Feb 20 16:51:52 crc kubenswrapper[4697]: I0220 16:51:52.893377 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" event={"ID":"8dc1f573-c5c3-436b-9c10-820add978d5b","Type":"ContainerDied","Data":"20536d476bffd66b144be093eb27eb0aa93f06d37191f8f856c532b9ec985e0c"}
Feb 20 16:51:52 crc kubenswrapper[4697]: E0220 16:51:52.925295 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="08f5a3c9502f8c0be5b1bb7c859e0acca1977fb8674d7f257dbc7a4c8d3eef43" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Feb 20 16:51:52 crc kubenswrapper[4697]: E0220 16:51:52.927630 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="08f5a3c9502f8c0be5b1bb7c859e0acca1977fb8674d7f257dbc7a4c8d3eef43" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Feb 20 16:51:52 crc kubenswrapper[4697]: E0220 16:51:52.930424 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="08f5a3c9502f8c0be5b1bb7c859e0acca1977fb8674d7f257dbc7a4c8d3eef43" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Feb 20 16:51:52 crc kubenswrapper[4697]: E0220 16:51:52.930507 4697 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="8abdcc99-bf45-4ca2-82b4-147b0a707333" containerName="watcher-applier"
Feb 20 16:51:53 crc kubenswrapper[4697]: I0220 16:51:53.902957 4697 generic.go:334] "Generic (PLEG): container finished" podID="8abdcc99-bf45-4ca2-82b4-147b0a707333" containerID="08f5a3c9502f8c0be5b1bb7c859e0acca1977fb8674d7f257dbc7a4c8d3eef43" exitCode=0
Feb 20 16:51:53 crc kubenswrapper[4697]: I0220 16:51:53.903012 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8abdcc99-bf45-4ca2-82b4-147b0a707333","Type":"ContainerDied","Data":"08f5a3c9502f8c0be5b1bb7c859e0acca1977fb8674d7f257dbc7a4c8d3eef43"}
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.480718 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hjgnj"
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.504665 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5f9j\" (UniqueName: \"kubernetes.io/projected/39238d29-1c84-4169-b011-455bd2e7f000-kube-api-access-v5f9j\") pod \"39238d29-1c84-4169-b011-455bd2e7f000\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") "
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.504748 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-db-sync-config-data\") pod \"39238d29-1c84-4169-b011-455bd2e7f000\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") "
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.504860 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-combined-ca-bundle\") pod \"39238d29-1c84-4169-b011-455bd2e7f000\" (UID: \"39238d29-1c84-4169-b011-455bd2e7f000\") "
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.538643 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "39238d29-1c84-4169-b011-455bd2e7f000" (UID: "39238d29-1c84-4169-b011-455bd2e7f000"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.538804 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39238d29-1c84-4169-b011-455bd2e7f000-kube-api-access-v5f9j" (OuterVolumeSpecName: "kube-api-access-v5f9j") pod "39238d29-1c84-4169-b011-455bd2e7f000" (UID: "39238d29-1c84-4169-b011-455bd2e7f000"). InnerVolumeSpecName "kube-api-access-v5f9j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.590409 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39238d29-1c84-4169-b011-455bd2e7f000" (UID: "39238d29-1c84-4169-b011-455bd2e7f000"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.606289 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5f9j\" (UniqueName: \"kubernetes.io/projected/39238d29-1c84-4169-b011-455bd2e7f000-kube-api-access-v5f9j\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.606317 4697 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.606326 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39238d29-1c84-4169-b011-455bd2e7f000-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.911795 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hjgnj" event={"ID":"39238d29-1c84-4169-b011-455bd2e7f000","Type":"ContainerDied","Data":"add3a53fcaca8909dc0762cd481a186b768d21345db5267f4b314cc0050f23f4"}
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.912164 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="add3a53fcaca8909dc0762cd481a186b768d21345db5267f4b314cc0050f23f4"
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.911971 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hjgnj"
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.915490 4697 generic.go:334] "Generic (PLEG): container finished" podID="aea295ce-e05d-46a9-9a39-94ffc4b29826" containerID="49cb9fb17195ab4325f654d4c9d330b48ea9e9c735b527b73e1728e5326f1b97" exitCode=1
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.915532 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aea295ce-e05d-46a9-9a39-94ffc4b29826","Type":"ContainerDied","Data":"49cb9fb17195ab4325f654d4c9d330b48ea9e9c735b527b73e1728e5326f1b97"}
Feb 20 16:51:54 crc kubenswrapper[4697]: I0220 16:51:54.915566 4697 scope.go:117] "RemoveContainer" containerID="5831b527e2e4be06390860f6b38fedefa29a4d8e84e38d5eca3a5e9fb5b2db28"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.113172 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-8474b565df-7h82r"]
Feb 20 16:51:55 crc kubenswrapper[4697]: E0220 16:51:55.113583 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39238d29-1c84-4169-b011-455bd2e7f000" containerName="barbican-db-sync"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.113599 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="39238d29-1c84-4169-b011-455bd2e7f000" containerName="barbican-db-sync"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.113813 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="39238d29-1c84-4169-b011-455bd2e7f000" containerName="barbican-db-sync"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.114749 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.121221 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2kwkr"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.121393 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.121464 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.178633 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-674dd9ffc6-spdfl"]
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.180158 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.185615 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.191562 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8474b565df-7h82r"]
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.218273 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-674dd9ffc6-spdfl"]
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.220374 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc44740-061d-4c3a-9164-735d6da2dcf7-config-data\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.222681 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dc44740-061d-4c3a-9164-735d6da2dcf7-config-data-custom\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.222731 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc44740-061d-4c3a-9164-735d6da2dcf7-combined-ca-bundle\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.222832 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc44740-061d-4c3a-9164-735d6da2dcf7-logs\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.222955 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgnq9\" (UniqueName: \"kubernetes.io/projected/9dc44740-061d-4c3a-9164-735d6da2dcf7-kube-api-access-vgnq9\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.262468 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58ccbc4c65-26lj7"]
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.270021 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.320484 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58ccbc4c65-26lj7"]
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.327015 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a69046-b0f7-4c26-a941-aba4a9475d0a-logs\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.327115 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgnq9\" (UniqueName: \"kubernetes.io/projected/9dc44740-061d-4c3a-9164-735d6da2dcf7-kube-api-access-vgnq9\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.327899 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a69046-b0f7-4c26-a941-aba4a9475d0a-config-data\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.327958 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrg8q\" (UniqueName: \"kubernetes.io/projected/71a69046-b0f7-4c26-a941-aba4a9475d0a-kube-api-access-mrg8q\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.328016 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc44740-061d-4c3a-9164-735d6da2dcf7-config-data\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.328070 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a69046-b0f7-4c26-a941-aba4a9475d0a-combined-ca-bundle\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.328097 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71a69046-b0f7-4c26-a941-aba4a9475d0a-config-data-custom\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.328138 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dc44740-061d-4c3a-9164-735d6da2dcf7-config-data-custom\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.328170 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc44740-061d-4c3a-9164-735d6da2dcf7-combined-ca-bundle\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.328228 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc44740-061d-4c3a-9164-735d6da2dcf7-logs\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.328730 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc44740-061d-4c3a-9164-735d6da2dcf7-logs\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.352413 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc44740-061d-4c3a-9164-735d6da2dcf7-combined-ca-bundle\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.353896 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc44740-061d-4c3a-9164-735d6da2dcf7-config-data\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.370479 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgnq9\" (UniqueName: \"kubernetes.io/projected/9dc44740-061d-4c3a-9164-735d6da2dcf7-kube-api-access-vgnq9\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.399129 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dc44740-061d-4c3a-9164-735d6da2dcf7-config-data-custom\") pod \"barbican-worker-8474b565df-7h82r\" (UID: \"9dc44740-061d-4c3a-9164-735d6da2dcf7\") " pod="openstack/barbican-worker-8474b565df-7h82r"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.433590 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-svc\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.433904 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a69046-b0f7-4c26-a941-aba4a9475d0a-config-data\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.433994 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrg8q\" (UniqueName: \"kubernetes.io/projected/71a69046-b0f7-4c26-a941-aba4a9475d0a-kube-api-access-mrg8q\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl"
Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.434074 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-config\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") "
pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.434166 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a69046-b0f7-4c26-a941-aba4a9475d0a-combined-ca-bundle\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.434234 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71a69046-b0f7-4c26-a941-aba4a9475d0a-config-data-custom\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.434318 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-nb\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.434388 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-swift-storage-0\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.434488 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a69046-b0f7-4c26-a941-aba4a9475d0a-logs\") pod 
\"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.434578 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fmsh\" (UniqueName: \"kubernetes.io/projected/388a02e2-0a16-4e6b-9c80-f72cad62a370-kube-api-access-6fmsh\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.434729 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-sb\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.435782 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a69046-b0f7-4c26-a941-aba4a9475d0a-logs\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.437530 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5b958d6fb8-lff5x"] Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.439511 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.453099 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a69046-b0f7-4c26-a941-aba4a9475d0a-combined-ca-bundle\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.453558 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a69046-b0f7-4c26-a941-aba4a9475d0a-config-data\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.459163 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.462809 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71a69046-b0f7-4c26-a941-aba4a9475d0a-config-data-custom\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: \"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.466549 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b958d6fb8-lff5x"] Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.469365 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrg8q\" (UniqueName: \"kubernetes.io/projected/71a69046-b0f7-4c26-a941-aba4a9475d0a-kube-api-access-mrg8q\") pod \"barbican-keystone-listener-674dd9ffc6-spdfl\" (UID: 
\"71a69046-b0f7-4c26-a941-aba4a9475d0a\") " pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.515205 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8474b565df-7h82r" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.520127 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.535309 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.536271 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-config\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.536325 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhhzn\" (UniqueName: \"kubernetes.io/projected/74cab817-1ae2-4488-9a46-1377bf50d579-kube-api-access-nhhzn\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.536351 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74cab817-1ae2-4488-9a46-1377bf50d579-logs\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.536401 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-combined-ca-bundle\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.536447 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-nb\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.536467 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data-custom\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.537176 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-config\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.537285 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-swift-storage-0\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.537406 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fmsh\" (UniqueName: 
\"kubernetes.io/projected/388a02e2-0a16-4e6b-9c80-f72cad62a370-kube-api-access-6fmsh\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.537428 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-sb\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.537547 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.537575 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-svc\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.538855 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-nb\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.538885 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-svc\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.539390 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-sb\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.541000 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-swift-storage-0\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.543596 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.564412 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fmsh\" (UniqueName: \"kubernetes.io/projected/388a02e2-0a16-4e6b-9c80-f72cad62a370-kube-api-access-6fmsh\") pod \"dnsmasq-dns-58ccbc4c65-26lj7\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.604595 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.642483 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.642595 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhhzn\" (UniqueName: \"kubernetes.io/projected/74cab817-1ae2-4488-9a46-1377bf50d579-kube-api-access-nhhzn\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.642617 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74cab817-1ae2-4488-9a46-1377bf50d579-logs\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.642656 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-combined-ca-bundle\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.642695 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data-custom\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " 
pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.643373 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74cab817-1ae2-4488-9a46-1377bf50d579-logs\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.648063 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.648405 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-combined-ca-bundle\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.658927 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data-custom\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: I0220 16:51:55.670077 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhhzn\" (UniqueName: \"kubernetes.io/projected/74cab817-1ae2-4488-9a46-1377bf50d579-kube-api-access-nhhzn\") pod \"barbican-api-5b958d6fb8-lff5x\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:55 crc kubenswrapper[4697]: 
I0220 16:51:55.839585 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.123609 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6bc54df884-mx794" podUID="215f7a56-10a1-4ae5-9071-4983dbb45b35" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.168:8443: connect: connection refused" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.127097 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7bf9cb54f8-bfxkg" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.167:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.167:8443: connect: connection refused" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.299478 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.299523 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.339106 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.339583 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.651300 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.651338 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.708010 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.742360 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.829458 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.829887 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.835185 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.941739 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5fddb45f9b-25kb9"] Feb 20 16:51:57 crc kubenswrapper[4697]: E0220 16:51:57.942146 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc1f573-c5c3-436b-9c10-820add978d5b" containerName="dnsmasq-dns" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.942158 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc1f573-c5c3-436b-9c10-820add978d5b" containerName="dnsmasq-dns" Feb 20 16:51:57 crc kubenswrapper[4697]: E0220 16:51:57.942168 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea295ce-e05d-46a9-9a39-94ffc4b29826" containerName="watcher-decision-engine" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.942173 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea295ce-e05d-46a9-9a39-94ffc4b29826" containerName="watcher-decision-engine" Feb 20 16:51:57 crc kubenswrapper[4697]: E0220 16:51:57.942184 4697 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc1f573-c5c3-436b-9c10-820add978d5b" containerName="init" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.942191 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc1f573-c5c3-436b-9c10-820add978d5b" containerName="init" Feb 20 16:51:57 crc kubenswrapper[4697]: E0220 16:51:57.942201 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8abdcc99-bf45-4ca2-82b4-147b0a707333" containerName="watcher-applier" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.942207 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8abdcc99-bf45-4ca2-82b4-147b0a707333" containerName="watcher-applier" Feb 20 16:51:57 crc kubenswrapper[4697]: E0220 16:51:57.942221 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea295ce-e05d-46a9-9a39-94ffc4b29826" containerName="watcher-decision-engine" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.942227 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea295ce-e05d-46a9-9a39-94ffc4b29826" containerName="watcher-decision-engine" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.942410 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea295ce-e05d-46a9-9a39-94ffc4b29826" containerName="watcher-decision-engine" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.942429 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc1f573-c5c3-436b-9c10-820add978d5b" containerName="dnsmasq-dns" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.942461 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="8abdcc99-bf45-4ca2-82b4-147b0a707333" containerName="watcher-applier" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.942799 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea295ce-e05d-46a9-9a39-94ffc4b29826" containerName="watcher-decision-engine" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 
16:51:57.943406 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fddb45f9b-25kb9" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.951821 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 20 16:51:57 crc kubenswrapper[4697]: I0220 16:51:57.952003 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.016888 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-config\") pod \"8dc1f573-c5c3-436b-9c10-820add978d5b\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") " Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.016964 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8abdcc99-bf45-4ca2-82b4-147b0a707333-logs\") pod \"8abdcc99-bf45-4ca2-82b4-147b0a707333\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") " Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017068 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-custom-prometheus-ca\") pod \"aea295ce-e05d-46a9-9a39-94ffc4b29826\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017093 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbxhq\" (UniqueName: \"kubernetes.io/projected/aea295ce-e05d-46a9-9a39-94ffc4b29826-kube-api-access-tbxhq\") pod \"aea295ce-e05d-46a9-9a39-94ffc4b29826\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") " Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017119 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-config-data\") pod \"aea295ce-e05d-46a9-9a39-94ffc4b29826\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") "
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017147 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbbtp\" (UniqueName: \"kubernetes.io/projected/8abdcc99-bf45-4ca2-82b4-147b0a707333-kube-api-access-cbbtp\") pod \"8abdcc99-bf45-4ca2-82b4-147b0a707333\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") "
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017205 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brhb2\" (UniqueName: \"kubernetes.io/projected/8dc1f573-c5c3-436b-9c10-820add978d5b-kube-api-access-brhb2\") pod \"8dc1f573-c5c3-436b-9c10-820add978d5b\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") "
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017269 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-svc\") pod \"8dc1f573-c5c3-436b-9c10-820add978d5b\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") "
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017299 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-combined-ca-bundle\") pod \"aea295ce-e05d-46a9-9a39-94ffc4b29826\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") "
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017339 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-combined-ca-bundle\") pod \"8abdcc99-bf45-4ca2-82b4-147b0a707333\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") "
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017399 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-nb\") pod \"8dc1f573-c5c3-436b-9c10-820add978d5b\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") "
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017420 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-config-data\") pod \"8abdcc99-bf45-4ca2-82b4-147b0a707333\" (UID: \"8abdcc99-bf45-4ca2-82b4-147b0a707333\") "
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017469 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-swift-storage-0\") pod \"8dc1f573-c5c3-436b-9c10-820add978d5b\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") "
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017491 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-sb\") pod \"8dc1f573-c5c3-436b-9c10-820add978d5b\" (UID: \"8dc1f573-c5c3-436b-9c10-820add978d5b\") "
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.017521 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea295ce-e05d-46a9-9a39-94ffc4b29826-logs\") pod \"aea295ce-e05d-46a9-9a39-94ffc4b29826\" (UID: \"aea295ce-e05d-46a9-9a39-94ffc4b29826\") "
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.044557 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fddb45f9b-25kb9"]
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.056665 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aea295ce-e05d-46a9-9a39-94ffc4b29826-logs" (OuterVolumeSpecName: "logs") pod "aea295ce-e05d-46a9-9a39-94ffc4b29826" (UID: "aea295ce-e05d-46a9-9a39-94ffc4b29826"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.059140 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea295ce-e05d-46a9-9a39-94ffc4b29826-kube-api-access-tbxhq" (OuterVolumeSpecName: "kube-api-access-tbxhq") pod "aea295ce-e05d-46a9-9a39-94ffc4b29826" (UID: "aea295ce-e05d-46a9-9a39-94ffc4b29826"). InnerVolumeSpecName "kube-api-access-tbxhq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.059355 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8abdcc99-bf45-4ca2-82b4-147b0a707333-logs" (OuterVolumeSpecName: "logs") pod "8abdcc99-bf45-4ca2-82b4-147b0a707333" (UID: "8abdcc99-bf45-4ca2-82b4-147b0a707333"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.071400 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8abdcc99-bf45-4ca2-82b4-147b0a707333-kube-api-access-cbbtp" (OuterVolumeSpecName: "kube-api-access-cbbtp") pod "8abdcc99-bf45-4ca2-82b4-147b0a707333" (UID: "8abdcc99-bf45-4ca2-82b4-147b0a707333"). InnerVolumeSpecName "kube-api-access-cbbtp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.076760 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" event={"ID":"8dc1f573-c5c3-436b-9c10-820add978d5b","Type":"ContainerDied","Data":"2c172174b02dd5c372f5f17a2daae55b13462cb0a7b7e7c73170558ce3419c9e"}
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.076841 4697 scope.go:117] "RemoveContainer" containerID="20536d476bffd66b144be093eb27eb0aa93f06d37191f8f856c532b9ec985e0c"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.077053 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.110413 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"8abdcc99-bf45-4ca2-82b4-147b0a707333","Type":"ContainerDied","Data":"fe3228b2d1a472c18ab9b7a1c7805bb9948e2c1adfc9ec8e28e73d9ed7066b68"}
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.110597 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.164764 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9scm\" (UniqueName: \"kubernetes.io/projected/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-kube-api-access-r9scm\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.164876 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-config-data\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.164927 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-config-data-custom\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.165012 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-public-tls-certs\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.165165 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-logs\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.165231 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-combined-ca-bundle\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.165255 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-internal-tls-certs\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.165395 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea295ce-e05d-46a9-9a39-94ffc4b29826-logs\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.165522 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8abdcc99-bf45-4ca2-82b4-147b0a707333-logs\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.166035 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbxhq\" (UniqueName: \"kubernetes.io/projected/aea295ce-e05d-46a9-9a39-94ffc4b29826-kube-api-access-tbxhq\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.166495 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbbtp\" (UniqueName: \"kubernetes.io/projected/8abdcc99-bf45-4ca2-82b4-147b0a707333-kube-api-access-cbbtp\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.167140 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.169058 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.169080 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aea295ce-e05d-46a9-9a39-94ffc4b29826","Type":"ContainerDied","Data":"d96be22343c3b2e35336818b837c98edc74a777a76b0ff819329a21ddc3c4c3e"}
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.169100 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.169110 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.169205 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.175729 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc1f573-c5c3-436b-9c10-820add978d5b-kube-api-access-brhb2" (OuterVolumeSpecName: "kube-api-access-brhb2") pod "8dc1f573-c5c3-436b-9c10-820add978d5b" (UID: "8dc1f573-c5c3-436b-9c10-820add978d5b"). InnerVolumeSpecName "kube-api-access-brhb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.176347 4697 scope.go:117] "RemoveContainer" containerID="c9656f2774e350da3d1bb1d9a56a8c63cef06abda935efd0fee0b1a6d0dd296a"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.228577 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8abdcc99-bf45-4ca2-82b4-147b0a707333" (UID: "8abdcc99-bf45-4ca2-82b4-147b0a707333"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.228663 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aea295ce-e05d-46a9-9a39-94ffc4b29826" (UID: "aea295ce-e05d-46a9-9a39-94ffc4b29826"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.268174 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-config-data\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.268229 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-config-data-custom\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.268272 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-public-tls-certs\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.268325 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-logs\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.268361 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-combined-ca-bundle\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.268380 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-internal-tls-certs\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.268406 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9scm\" (UniqueName: \"kubernetes.io/projected/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-kube-api-access-r9scm\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.268488 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brhb2\" (UniqueName: \"kubernetes.io/projected/8dc1f573-c5c3-436b-9c10-820add978d5b-kube-api-access-brhb2\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.268499 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.268508 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.269615 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-logs\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.294245 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-config-data-custom\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.314512 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-internal-tls-certs\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.331827 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-public-tls-certs\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.332083 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8474b565df-7h82r"]
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.339399 4697 scope.go:117] "RemoveContainer" containerID="08f5a3c9502f8c0be5b1bb7c859e0acca1977fb8674d7f257dbc7a4c8d3eef43"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.339588 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-combined-ca-bundle\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.340283 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-config-data\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.357120 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9scm\" (UniqueName: \"kubernetes.io/projected/5fcf8d33-fdb6-43e6-aade-d7dc55b3848c-kube-api-access-r9scm\") pod \"barbican-api-5fddb45f9b-25kb9\" (UID: \"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c\") " pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.372940 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.416693 4697 scope.go:117] "RemoveContainer" containerID="49cb9fb17195ab4325f654d4c9d330b48ea9e9c735b527b73e1728e5326f1b97"
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.438410 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8dc1f573-c5c3-436b-9c10-820add978d5b" (UID: "8dc1f573-c5c3-436b-9c10-820add978d5b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.464804 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85d8b4ccc6-tdklm"]
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.474720 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-config-data" (OuterVolumeSpecName: "config-data") pod "8abdcc99-bf45-4ca2-82b4-147b0a707333" (UID: "8abdcc99-bf45-4ca2-82b4-147b0a707333"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.475781 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8abdcc99-bf45-4ca2-82b4-147b0a707333-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.475795 4697 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: W0220 16:51:58.552779 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e1ff974_0f05_484c_b267_f1537ec9495e.slice/crio-93375f9b3263f6bfa6466650e7cb1d24ed630b80a477382b44d1fb24c8ce71c5 WatchSource:0}: Error finding container 93375f9b3263f6bfa6466650e7cb1d24ed630b80a477382b44d1fb24c8ce71c5: Status 404 returned error can't find the container with id 93375f9b3263f6bfa6466650e7cb1d24ed630b80a477382b44d1fb24c8ce71c5
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.620197 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-674dd9ffc6-spdfl"]
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.646568 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b958d6fb8-lff5x"]
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.654024 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58ccbc4c65-26lj7"]
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.719381 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "aea295ce-e05d-46a9-9a39-94ffc4b29826" (UID: "aea295ce-e05d-46a9-9a39-94ffc4b29826"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.751202 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-config" (OuterVolumeSpecName: "config") pod "8dc1f573-c5c3-436b-9c10-820add978d5b" (UID: "8dc1f573-c5c3-436b-9c10-820add978d5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.788110 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-config\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.788402 4697 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.805716 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8dc1f573-c5c3-436b-9c10-820add978d5b" (UID: "8dc1f573-c5c3-436b-9c10-820add978d5b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.805890 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8dc1f573-c5c3-436b-9c10-820add978d5b" (UID: "8dc1f573-c5c3-436b-9c10-820add978d5b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.808007 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8dc1f573-c5c3-436b-9c10-820add978d5b" (UID: "8dc1f573-c5c3-436b-9c10-820add978d5b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.822582 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-config-data" (OuterVolumeSpecName: "config-data") pod "aea295ce-e05d-46a9-9a39-94ffc4b29826" (UID: "aea295ce-e05d-46a9-9a39-94ffc4b29826"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.903455 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea295ce-e05d-46a9-9a39-94ffc4b29826-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.903483 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.903492 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:58 crc kubenswrapper[4697]: I0220 16:51:58.903501 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dc1f573-c5c3-436b-9c10-820add978d5b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.219918 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.242030 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"]
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.243651 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b988e030-0e85-401e-bd22-057f9c2de43d","Type":"ContainerStarted","Data":"8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e"}
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.255925 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.256975 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b958d6fb8-lff5x" event={"ID":"74cab817-1ae2-4488-9a46-1377bf50d579","Type":"ContainerStarted","Data":"378bcda5d5b866811e18be7354d4796ab14daed12b283b9b2dc2169c0c96cd66"}
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.257085 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.259037 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.273293 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85d8b4ccc6-tdklm" event={"ID":"9e1ff974-0f05-484c-b267-f1537ec9495e","Type":"ContainerStarted","Data":"93375f9b3263f6bfa6466650e7cb1d24ed630b80a477382b44d1fb24c8ce71c5"}
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.284082 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8474b565df-7h82r" event={"ID":"9dc44740-061d-4c3a-9164-735d6da2dcf7","Type":"ContainerStarted","Data":"ddaefeb590ba686b9a50ce9efd97d87a5aa6748f51a6ba0239355119f358aaa1"}
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.296638 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d9f898cc5-9k7vr"]
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.296673 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" event={"ID":"388a02e2-0a16-4e6b-9c80-f72cad62a370","Type":"ContainerStarted","Data":"933e9552c801db7385d682e96b8c03d0705443c0741dca666c9fe383c1a12c5d"}
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.303732 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d9f898cc5-9k7vr"]
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.303887 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" event={"ID":"71a69046-b0f7-4c26-a941-aba4a9475d0a","Type":"ContainerStarted","Data":"24887d7c3e96a5d9c875b076978169eee712385e9fe5df2382991c07f7ee6522"}
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.314490 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.319679 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.349545 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.373007 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.379885 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.382619 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.389104 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.423837 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzc5k\" (UniqueName: \"kubernetes.io/projected/1e5e751a-8226-4445-87ff-2347c603df7c-kube-api-access-xzc5k\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.423898 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.424008 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5e751a-8226-4445-87ff-2347c603df7c-logs\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.424129 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-config-data\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.428250 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fddb45f9b-25kb9"]
Feb 20 16:51:59 crc kubenswrapper[4697]: W0220 16:51:59.433227 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fcf8d33_fdb6_43e6_aade_d7dc55b3848c.slice/crio-be439acbd6ec8472a2df4009fee073f10d7f3e55aaa74780b0cfbee6fe4495c3 WatchSource:0}: Error finding container be439acbd6ec8472a2df4009fee073f10d7f3e55aaa74780b0cfbee6fe4495c3: Status 404 returned error can't find the container with id be439acbd6ec8472a2df4009fee073f10d7f3e55aaa74780b0cfbee6fe4495c3
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.525542 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.525596 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5e751a-8226-4445-87ff-2347c603df7c-logs\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.525764 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-logs\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.525832 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.526006 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-config-data\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.526049 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llg97\" (UniqueName: \"kubernetes.io/projected/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-kube-api-access-llg97\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.526072 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5e751a-8226-4445-87ff-2347c603df7c-logs\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.526203 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.526275 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzc5k\" (UniqueName: \"kubernetes.io/projected/1e5e751a-8226-4445-87ff-2347c603df7c-kube-api-access-xzc5k\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.526382 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.532080 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-config-data\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.532154 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.547049 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzc5k\" (UniqueName: \"kubernetes.io/projected/1e5e751a-8226-4445-87ff-2347c603df7c-kube-api-access-xzc5k\") pod \"watcher-applier-0\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.591005 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.628414 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-logs\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.628503 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.628562 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llg97\" (UniqueName: \"kubernetes.io/projected/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-kube-api-access-llg97\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.628611 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0"
Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.628663 4697 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.628980 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-logs\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.638125 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.639855 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.647378 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.667977 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llg97\" (UniqueName: 
\"kubernetes.io/projected/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-kube-api-access-llg97\") pod \"watcher-decision-engine-0\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:51:59 crc kubenswrapper[4697]: I0220 16:51:59.720973 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.360739 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85d8b4ccc6-tdklm" event={"ID":"9e1ff974-0f05-484c-b267-f1537ec9495e","Type":"ContainerStarted","Data":"f36cc1614bf4c38687ce8c7c82ea5fa468899adeddfeee3b173bbfe7623f0b25"} Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.362300 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-85d8b4ccc6-tdklm" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.371317 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.374945 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b958d6fb8-lff5x" event={"ID":"74cab817-1ae2-4488-9a46-1377bf50d579","Type":"ContainerStarted","Data":"0e97746652449e49babc04c164c4b7c1830dfe71eaf0caec732cd4c31ebffda4"} Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.374979 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b958d6fb8-lff5x" event={"ID":"74cab817-1ae2-4488-9a46-1377bf50d579","Type":"ContainerStarted","Data":"f0adea8f73186007ac9ec77b593d4d32f1ff9a574813be46e3e427521776984c"} Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.375154 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.375276 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.385605 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-85d8b4ccc6-tdklm" podStartSLOduration=9.385593169 podStartE2EDuration="9.385593169s" podCreationTimestamp="2026-02-20 16:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:00.376466425 +0000 UTC m=+1228.156511833" watchObservedRunningTime="2026-02-20 16:52:00.385593169 +0000 UTC m=+1228.165638577" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.400666 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fddb45f9b-25kb9" event={"ID":"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c","Type":"ContainerStarted","Data":"0f0988571ce4564bc8bfc0aeffabd241f8b2e703747c18e7b936ecac126db681"} Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.400733 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fddb45f9b-25kb9" event={"ID":"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c","Type":"ContainerStarted","Data":"be439acbd6ec8472a2df4009fee073f10d7f3e55aaa74780b0cfbee6fe4495c3"} Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.403578 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5b958d6fb8-lff5x" podStartSLOduration=5.40356086 podStartE2EDuration="5.40356086s" podCreationTimestamp="2026-02-20 16:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:00.400934015 +0000 UTC m=+1228.180979423" watchObservedRunningTime="2026-02-20 16:52:00.40356086 +0000 UTC m=+1228.183606268" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.408675 4697 generic.go:334] "Generic (PLEG): container finished" podID="388a02e2-0a16-4e6b-9c80-f72cad62a370" 
containerID="7dec8c6b5ce098c88dfb4b604bc2234243247b64c9240daf2f3bd547e070e8a1" exitCode=0 Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.408768 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" event={"ID":"388a02e2-0a16-4e6b-9c80-f72cad62a370","Type":"ContainerDied","Data":"7dec8c6b5ce098c88dfb4b604bc2234243247b64c9240daf2f3bd547e070e8a1"} Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.419694 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.419719 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.420995 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dzcml" event={"ID":"4bc8e40c-20d6-41e9-9f4e-25112a77e115","Type":"ContainerStarted","Data":"d204f0b69a294f0c73e9d8e10974149f4b0bd36582ae08b87b1ebcae8059eacc"} Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.421296 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.421322 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.528799 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-dzcml" podStartSLOduration=6.387798861 podStartE2EDuration="53.528778285s" podCreationTimestamp="2026-02-20 16:51:07 +0000 UTC" firstStartedPulling="2026-02-20 16:51:10.702549661 +0000 UTC m=+1178.482595069" lastFinishedPulling="2026-02-20 16:51:57.843529085 +0000 UTC m=+1225.623574493" observedRunningTime="2026-02-20 16:52:00.462822875 +0000 UTC m=+1228.242868283" watchObservedRunningTime="2026-02-20 16:52:00.528778285 +0000 UTC m=+1228.308823693" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.603142 4697 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/watcher-decision-engine-0"] Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.891648 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8abdcc99-bf45-4ca2-82b4-147b0a707333" path="/var/lib/kubelet/pods/8abdcc99-bf45-4ca2-82b4-147b0a707333/volumes" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.892407 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dc1f573-c5c3-436b-9c10-820add978d5b" path="/var/lib/kubelet/pods/8dc1f573-c5c3-436b-9c10-820add978d5b/volumes" Feb 20 16:52:00 crc kubenswrapper[4697]: I0220 16:52:00.892983 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea295ce-e05d-46a9-9a39-94ffc4b29826" path="/var/lib/kubelet/pods/aea295ce-e05d-46a9-9a39-94ffc4b29826/volumes" Feb 20 16:52:01 crc kubenswrapper[4697]: I0220 16:52:01.230888 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:52:01 crc kubenswrapper[4697]: I0220 16:52:01.231264 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" containerName="watcher-api-log" containerID="cri-o://9f8983e72087ad930ae19fdb48bd9e133d258ec0aee90328c0027fc97ab70b01" gracePeriod=30 Feb 20 16:52:01 crc kubenswrapper[4697]: I0220 16:52:01.231338 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" containerName="watcher-api" containerID="cri-o://31d7bab99a76b21dcde31195ee96704ee92e69d68ad7f7f423f41d161a7ad9ed" gracePeriod=30 Feb 20 16:52:01 crc kubenswrapper[4697]: I0220 16:52:01.431295 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fddb45f9b-25kb9" event={"ID":"5fcf8d33-fdb6-43e6-aade-d7dc55b3848c","Type":"ContainerStarted","Data":"0b264c651cad933e291ee959f68b8f6f7375d67ba58e4affda26a4a2c6425823"} Feb 20 16:52:01 crc 
kubenswrapper[4697]: I0220 16:52:01.432525 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fddb45f9b-25kb9" Feb 20 16:52:01 crc kubenswrapper[4697]: I0220 16:52:01.432553 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fddb45f9b-25kb9" Feb 20 16:52:01 crc kubenswrapper[4697]: I0220 16:52:01.434330 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e","Type":"ContainerStarted","Data":"4cefd8bf27bd23573d851816f6013edc60787f587573f377a33e6c9ef3e379a3"} Feb 20 16:52:01 crc kubenswrapper[4697]: I0220 16:52:01.437102 4697 generic.go:334] "Generic (PLEG): container finished" podID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" containerID="9f8983e72087ad930ae19fdb48bd9e133d258ec0aee90328c0027fc97ab70b01" exitCode=143 Feb 20 16:52:01 crc kubenswrapper[4697]: I0220 16:52:01.437136 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f","Type":"ContainerDied","Data":"9f8983e72087ad930ae19fdb48bd9e133d258ec0aee90328c0027fc97ab70b01"} Feb 20 16:52:01 crc kubenswrapper[4697]: I0220 16:52:01.438329 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"1e5e751a-8226-4445-87ff-2347c603df7c","Type":"ContainerStarted","Data":"5c51ca9b955c64a77b57a92ac8def5c6aca40c7b590478af1d754b6b68b45944"} Feb 20 16:52:01 crc kubenswrapper[4697]: I0220 16:52:01.464889 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5fddb45f9b-25kb9" podStartSLOduration=4.464871882 podStartE2EDuration="4.464871882s" podCreationTimestamp="2026-02-20 16:51:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:01.451074373 +0000 UTC m=+1229.231119791" 
watchObservedRunningTime="2026-02-20 16:52:01.464871882 +0000 UTC m=+1229.244917290" Feb 20 16:52:01 crc kubenswrapper[4697]: I0220 16:52:01.872037 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d9f898cc5-9k7vr" podUID="8dc1f573-c5c3-436b-9c10-820add978d5b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: i/o timeout" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.433885 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.434189 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.434285 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.435896 4697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.534340 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" event={"ID":"388a02e2-0a16-4e6b-9c80-f72cad62a370","Type":"ContainerStarted","Data":"55dcde7167e76b063abb1cba16c1428e8a382dcdc027781fbc3cb5bd5b5fc9d4"} Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.534678 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.548482 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e","Type":"ContainerStarted","Data":"bf201182bdd16e8f323bc9095059948530ad616d7108a16c235e3db97ee1f97f"} Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.564149 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" podStartSLOduration=7.564122366 podStartE2EDuration="7.564122366s" podCreationTimestamp="2026-02-20 16:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:02.555911715 +0000 UTC m=+1230.335957123" watchObservedRunningTime="2026-02-20 16:52:02.564122366 +0000 UTC m=+1230.344167774" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.581366 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.588722 4697 generic.go:334] "Generic (PLEG): container finished" podID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" containerID="31d7bab99a76b21dcde31195ee96704ee92e69d68ad7f7f423f41d161a7ad9ed" exitCode=0 Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.588774 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f","Type":"ContainerDied","Data":"31d7bab99a76b21dcde31195ee96704ee92e69d68ad7f7f423f41d161a7ad9ed"} Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.588959 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=3.588938686 podStartE2EDuration="3.588938686s" podCreationTimestamp="2026-02-20 16:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:02.581157124 +0000 UTC m=+1230.361202532" watchObservedRunningTime="2026-02-20 16:52:02.588938686 +0000 UTC m=+1230.368984094" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.589560 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.591318 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"1e5e751a-8226-4445-87ff-2347c603df7c","Type":"ContainerStarted","Data":"f8b52430b475ec36694d4d83616a15f714bcdd3a471de49f37d537b0228a5aa1"} Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.617953 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.660843 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.660827951 podStartE2EDuration="3.660827951s" podCreationTimestamp="2026-02-20 16:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:02.651271756 +0000 UTC m=+1230.431317164" watchObservedRunningTime="2026-02-20 16:52:02.660827951 +0000 UTC m=+1230.440873359" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.743121 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7t9g\" (UniqueName: \"kubernetes.io/projected/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-kube-api-access-d7t9g\") pod \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.743209 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-logs\") pod \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.743296 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-config-data\") pod \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.743334 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-combined-ca-bundle\") pod \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.743370 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-custom-prometheus-ca\") pod \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\" (UID: \"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f\") " Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.745217 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-logs" (OuterVolumeSpecName: "logs") pod "cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" (UID: "cfed00e9-45d6-47dc-8a93-5895a8bc1f8f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.752550 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-kube-api-access-d7t9g" (OuterVolumeSpecName: "kube-api-access-d7t9g") pod "cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" (UID: "cfed00e9-45d6-47dc-8a93-5895a8bc1f8f"). InnerVolumeSpecName "kube-api-access-d7t9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.832769 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" (UID: "cfed00e9-45d6-47dc-8a93-5895a8bc1f8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.834580 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" (UID: "cfed00e9-45d6-47dc-8a93-5895a8bc1f8f"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.839829 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-config-data" (OuterVolumeSpecName: "config-data") pod "cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" (UID: "cfed00e9-45d6-47dc-8a93-5895a8bc1f8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.845935 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7t9g\" (UniqueName: \"kubernetes.io/projected/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-kube-api-access-d7t9g\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.845968 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.845982 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.845993 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:02 crc kubenswrapper[4697]: I0220 16:52:02.846003 4697 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.611513 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cfed00e9-45d6-47dc-8a93-5895a8bc1f8f","Type":"ContainerDied","Data":"ec81f4b7e8118af7a6197055f0e9dc477b78d595ffe5c5d303c4496620b57073"} Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.611952 4697 scope.go:117] "RemoveContainer" containerID="31d7bab99a76b21dcde31195ee96704ee92e69d68ad7f7f423f41d161a7ad9ed" Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.611820 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.619732 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8474b565df-7h82r" event={"ID":"9dc44740-061d-4c3a-9164-735d6da2dcf7","Type":"ContainerStarted","Data":"d8268c5635916d22c5547ba2af91cb948ba219dc7feb98fc5f1e1168810f7873"} Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.619776 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8474b565df-7h82r" event={"ID":"9dc44740-061d-4c3a-9164-735d6da2dcf7","Type":"ContainerStarted","Data":"108f5811676be2bce36ec0a7e47968e5eb44efb5be065009a3d8895c7bcbad70"} Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.625605 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" event={"ID":"71a69046-b0f7-4c26-a941-aba4a9475d0a","Type":"ContainerStarted","Data":"9ae439dbbc3dc7234a495dacda7bde7c54cecacdff029c1d68dbce7815f85a9c"} Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.625647 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" event={"ID":"71a69046-b0f7-4c26-a941-aba4a9475d0a","Type":"ContainerStarted","Data":"ab0a240fe1aff8cd9c399dc6bbc21cbe4853d25c72053368cae123469e76ba05"} Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.658425 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.673816 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.710979 4697 scope.go:117] "RemoveContainer" containerID="9f8983e72087ad930ae19fdb48bd9e133d258ec0aee90328c0027fc97ab70b01" Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.727544 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:52:03 crc 
kubenswrapper[4697]: E0220 16:52:03.727970 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" containerName="watcher-api"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.727982 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" containerName="watcher-api"
Feb 20 16:52:03 crc kubenswrapper[4697]: E0220 16:52:03.728003 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" containerName="watcher-api-log"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.728009 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" containerName="watcher-api-log"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.728169 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" containerName="watcher-api"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.728193 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" containerName="watcher-api-log"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.729179 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.731779 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.733699 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.733923 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.735183 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-674dd9ffc6-spdfl" podStartSLOduration=5.25249251 podStartE2EDuration="8.735169452s" podCreationTimestamp="2026-02-20 16:51:55 +0000 UTC" firstStartedPulling="2026-02-20 16:51:58.754446554 +0000 UTC m=+1226.534491962" lastFinishedPulling="2026-02-20 16:52:02.237123496 +0000 UTC m=+1230.017168904" observedRunningTime="2026-02-20 16:52:03.65646392 +0000 UTC m=+1231.436509328" watchObservedRunningTime="2026-02-20 16:52:03.735169452 +0000 UTC m=+1231.515214860"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.748151 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.755528 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-8474b565df-7h82r" podStartSLOduration=4.927015248 podStartE2EDuration="8.755508532s" podCreationTimestamp="2026-02-20 16:51:55 +0000 UTC" firstStartedPulling="2026-02-20 16:51:58.414766263 +0000 UTC m=+1226.194811671" lastFinishedPulling="2026-02-20 16:52:02.243259547 +0000 UTC m=+1230.023304955" observedRunningTime="2026-02-20 16:52:03.67683273 +0000 UTC m=+1231.456878138" watchObservedRunningTime="2026-02-20 16:52:03.755508532 +0000 UTC m=+1231.535553950"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.871510 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.871591 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-config-data\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.871631 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d9f5b2-5617-4e75-97f2-f00513a5be67-logs\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.871649 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.871670 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg8nk\" (UniqueName: \"kubernetes.io/projected/f0d9f5b2-5617-4e75-97f2-f00513a5be67-kube-api-access-dg8nk\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.871696 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-public-tls-certs\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.871936 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.973244 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.973606 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg8nk\" (UniqueName: \"kubernetes.io/projected/f0d9f5b2-5617-4e75-97f2-f00513a5be67-kube-api-access-dg8nk\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.973634 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-public-tls-certs\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.973681 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.973751 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.973807 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-config-data\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.973844 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d9f5b2-5617-4e75-97f2-f00513a5be67-logs\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.974235 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d9f5b2-5617-4e75-97f2-f00513a5be67-logs\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.982903 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.983252 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-config-data\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.985593 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-public-tls-certs\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.985625 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.990320 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:03 crc kubenswrapper[4697]: I0220 16:52:03.991361 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg8nk\" (UniqueName: \"kubernetes.io/projected/f0d9f5b2-5617-4e75-97f2-f00513a5be67-kube-api-access-dg8nk\") pod \"watcher-api-0\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " pod="openstack/watcher-api-0"
Feb 20 16:52:04 crc kubenswrapper[4697]: I0220 16:52:04.051549 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Feb 20 16:52:04 crc kubenswrapper[4697]: I0220 16:52:04.594711 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Feb 20 16:52:04 crc kubenswrapper[4697]: I0220 16:52:04.605491 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Feb 20 16:52:04 crc kubenswrapper[4697]: I0220 16:52:04.655463 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f0d9f5b2-5617-4e75-97f2-f00513a5be67","Type":"ContainerStarted","Data":"1c4ae7101858cf283d4c95e48d311778ba207a3963e602fcf30f60474ed88985"}
Feb 20 16:52:04 crc kubenswrapper[4697]: I0220 16:52:04.894418 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfed00e9-45d6-47dc-8a93-5895a8bc1f8f" path="/var/lib/kubelet/pods/cfed00e9-45d6-47dc-8a93-5895a8bc1f8f/volumes"
Feb 20 16:52:05 crc kubenswrapper[4697]: I0220 16:52:05.670559 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f0d9f5b2-5617-4e75-97f2-f00513a5be67","Type":"ContainerStarted","Data":"6326de61069a59dad5f3e03f2dd41b7639ae730654c051a3708c8ceb947aada2"}
Feb 20 16:52:05 crc kubenswrapper[4697]: I0220 16:52:05.670871 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f0d9f5b2-5617-4e75-97f2-f00513a5be67","Type":"ContainerStarted","Data":"7ba3aace87cb8a1a606f5248c0e0cf42eb249fb39aa36e0ff142c0657d42b5e4"}
Feb 20 16:52:05 crc kubenswrapper[4697]: I0220 16:52:05.670987 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 20 16:52:05 crc kubenswrapper[4697]: I0220 16:52:05.693801 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.693783859 podStartE2EDuration="2.693783859s" podCreationTimestamp="2026-02-20 16:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:05.685953081 +0000 UTC m=+1233.465998489" watchObservedRunningTime="2026-02-20 16:52:05.693783859 +0000 UTC m=+1233.473829267"
Feb 20 16:52:06 crc kubenswrapper[4697]: I0220 16:52:06.697006 4697 generic.go:334] "Generic (PLEG): container finished" podID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerID="bf201182bdd16e8f323bc9095059948530ad616d7108a16c235e3db97ee1f97f" exitCode=1
Feb 20 16:52:06 crc kubenswrapper[4697]: I0220 16:52:06.697084 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e","Type":"ContainerDied","Data":"bf201182bdd16e8f323bc9095059948530ad616d7108a16c235e3db97ee1f97f"}
Feb 20 16:52:06 crc kubenswrapper[4697]: I0220 16:52:06.697932 4697 scope.go:117] "RemoveContainer" containerID="bf201182bdd16e8f323bc9095059948530ad616d7108a16c235e3db97ee1f97f"
Feb 20 16:52:06 crc kubenswrapper[4697]: I0220 16:52:06.808746 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f4865d6c6-nk2sf"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.037513 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57ffcc5f55-zw4vb"]
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.037811 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57ffcc5f55-zw4vb" podUID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerName="neutron-api" containerID="cri-o://898d85761d702f5464f68eb16f900f5a0d514f25a305e59f68a2d3aa7df28a68" gracePeriod=30
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.038208 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57ffcc5f55-zw4vb" podUID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerName="neutron-httpd" containerID="cri-o://44f3535f183a3dc31ea58d908f7531be76c8b45705b172c9402ca3cbfc9e6204" gracePeriod=30
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.102524 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76b77c89fc-t9rjg"]
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.104115 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.119579 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76b77c89fc-t9rjg"]
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.157745 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-57ffcc5f55-zw4vb" podUID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.173:9696/\": read tcp 10.217.0.2:41364->10.217.0.173:9696: read: connection reset by peer"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.288563 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxgws\" (UniqueName: \"kubernetes.io/projected/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-kube-api-access-fxgws\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.288607 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-ovndb-tls-certs\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.288632 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-public-tls-certs\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.288662 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-config\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.288687 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-combined-ca-bundle\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.288821 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-httpd-config\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.288845 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-internal-tls-certs\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.330963 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b958d6fb8-lff5x"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.390869 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-config\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.390935 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-combined-ca-bundle\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.391090 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-httpd-config\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.391125 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-internal-tls-certs\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.391160 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxgws\" (UniqueName: \"kubernetes.io/projected/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-kube-api-access-fxgws\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.391184 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-ovndb-tls-certs\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.391214 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-public-tls-certs\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.401299 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-ovndb-tls-certs\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.404172 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-combined-ca-bundle\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.410180 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-config\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.414772 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-httpd-config\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.415427 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-public-tls-certs\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.419824 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-internal-tls-certs\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.421167 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxgws\" (UniqueName: \"kubernetes.io/projected/0e0dd99f-4186-47a0-b6ba-c8d4abd040b0-kube-api-access-fxgws\") pod \"neutron-76b77c89fc-t9rjg\" (UID: \"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0\") " pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.439055 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76b77c89fc-t9rjg"
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.718269 4697 generic.go:334] "Generic (PLEG): container finished" podID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerID="44f3535f183a3dc31ea58d908f7531be76c8b45705b172c9402ca3cbfc9e6204" exitCode=0
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.718347 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57ffcc5f55-zw4vb" event={"ID":"b3577c39-e7f5-4def-97f1-8e68e4bc07dd","Type":"ContainerDied","Data":"44f3535f183a3dc31ea58d908f7531be76c8b45705b172c9402ca3cbfc9e6204"}
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.722920 4697 generic.go:334] "Generic (PLEG): container finished" podID="4bc8e40c-20d6-41e9-9f4e-25112a77e115" containerID="d204f0b69a294f0c73e9d8e10974149f4b0bd36582ae08b87b1ebcae8059eacc" exitCode=0
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.723020 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dzcml" event={"ID":"4bc8e40c-20d6-41e9-9f4e-25112a77e115","Type":"ContainerDied","Data":"d204f0b69a294f0c73e9d8e10974149f4b0bd36582ae08b87b1ebcae8059eacc"}
Feb 20 16:52:07 crc kubenswrapper[4697]: I0220 16:52:07.757279 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b958d6fb8-lff5x"
Feb 20 16:52:08 crc kubenswrapper[4697]: I0220 16:52:08.596914 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 20 16:52:09 crc kubenswrapper[4697]: I0220 16:52:09.052702 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Feb 20 16:52:09 crc kubenswrapper[4697]: I0220 16:52:09.138342 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-57ffcc5f55-zw4vb" podUID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.173:9696/\": dial tcp 10.217.0.173:9696: connect: connection refused"
Feb 20 16:52:09 crc kubenswrapper[4697]: I0220 16:52:09.592941 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Feb 20 16:52:09 crc kubenswrapper[4697]: I0220 16:52:09.625744 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Feb 20 16:52:09 crc kubenswrapper[4697]: I0220 16:52:09.722546 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 20 16:52:09 crc kubenswrapper[4697]: I0220 16:52:09.722591 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 20 16:52:09 crc kubenswrapper[4697]: I0220 16:52:09.783600 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Feb 20 16:52:10 crc kubenswrapper[4697]: I0220 16:52:10.603761 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:52:10 crc kubenswrapper[4697]: I0220 16:52:10.607576 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7"
Feb 20 16:52:10 crc kubenswrapper[4697]: I0220 16:52:10.693545 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ddbff5fc9-jlmsc"]
Feb 20 16:52:10 crc kubenswrapper[4697]: I0220 16:52:10.693862 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" podUID="21ea8664-e35f-40a2-9f86-391033cfc7bd" containerName="dnsmasq-dns" containerID="cri-o://628bafb9542b89c830ba13bcb4462d7a2f72ae2dc774d358da8bdcae5ac6c56f" gracePeriod=10
Feb 20 16:52:10 crc kubenswrapper[4697]: E0220 16:52:10.904263 4697 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21ea8664_e35f_40a2_9f86_391033cfc7bd.slice/crio-conmon-628bafb9542b89c830ba13bcb4462d7a2f72ae2dc774d358da8bdcae5ac6c56f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21ea8664_e35f_40a2_9f86_391033cfc7bd.slice/crio-628bafb9542b89c830ba13bcb4462d7a2f72ae2dc774d358da8bdcae5ac6c56f.scope\": RecentStats: unable to find data in memory cache]"
Feb 20 16:52:11 crc kubenswrapper[4697]: I0220 16:52:11.241687 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7bf9cb54f8-bfxkg"
Feb 20 16:52:11 crc kubenswrapper[4697]: I0220 16:52:11.304817 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6bc54df884-mx794"
Feb 20 16:52:11 crc kubenswrapper[4697]: I0220 16:52:11.708533 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" podUID="21ea8664-e35f-40a2-9f86-391033cfc7bd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: connect: connection refused"
Feb 20 16:52:11 crc kubenswrapper[4697]: I0220 16:52:11.764994 4697 generic.go:334] "Generic (PLEG): container finished" podID="21ea8664-e35f-40a2-9f86-391033cfc7bd" containerID="628bafb9542b89c830ba13bcb4462d7a2f72ae2dc774d358da8bdcae5ac6c56f" exitCode=0
Feb 20 16:52:11 crc kubenswrapper[4697]: I0220 16:52:11.765033 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" event={"ID":"21ea8664-e35f-40a2-9f86-391033cfc7bd","Type":"ContainerDied","Data":"628bafb9542b89c830ba13bcb4462d7a2f72ae2dc774d358da8bdcae5ac6c56f"}
Feb 20 16:52:11 crc kubenswrapper[4697]: I0220 16:52:11.814653 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fddb45f9b-25kb9"
Feb 20 16:52:11 crc kubenswrapper[4697]: I0220 16:52:11.887844 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b958d6fb8-lff5x"]
Feb 20 16:52:11 crc kubenswrapper[4697]: I0220 16:52:11.888225 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b958d6fb8-lff5x" podUID="74cab817-1ae2-4488-9a46-1377bf50d579" containerName="barbican-api-log" containerID="cri-o://f0adea8f73186007ac9ec77b593d4d32f1ff9a574813be46e3e427521776984c" gracePeriod=30
Feb 20 16:52:11 crc kubenswrapper[4697]: I0220 16:52:11.888889 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b958d6fb8-lff5x" podUID="74cab817-1ae2-4488-9a46-1377bf50d579" containerName="barbican-api" containerID="cri-o://0e97746652449e49babc04c164c4b7c1830dfe71eaf0caec732cd4c31ebffda4" gracePeriod=30
Feb 20 16:52:11 crc kubenswrapper[4697]: I0220 16:52:11.903904 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b958d6fb8-lff5x" podUID="74cab817-1ae2-4488-9a46-1377bf50d579" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.181:9311/healthcheck\": EOF"
Feb 20 16:52:12 crc kubenswrapper[4697]: I0220 16:52:12.858620 4697 generic.go:334] "Generic (PLEG): container finished" podID="74cab817-1ae2-4488-9a46-1377bf50d579" containerID="f0adea8f73186007ac9ec77b593d4d32f1ff9a574813be46e3e427521776984c" exitCode=143
Feb 20 16:52:12 crc kubenswrapper[4697]: I0220 16:52:12.859153 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b958d6fb8-lff5x" event={"ID":"74cab817-1ae2-4488-9a46-1377bf50d579","Type":"ContainerDied","Data":"f0adea8f73186007ac9ec77b593d4d32f1ff9a574813be46e3e427521776984c"}
Feb 20 16:52:12 crc kubenswrapper[4697]: I0220 16:52:12.875002 4697 generic.go:334] "Generic (PLEG): container finished" podID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerID="898d85761d702f5464f68eb16f900f5a0d514f25a305e59f68a2d3aa7df28a68" exitCode=0
Feb 20 16:52:12 crc kubenswrapper[4697]: I0220 16:52:12.875043 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57ffcc5f55-zw4vb" event={"ID":"b3577c39-e7f5-4def-97f1-8e68e4bc07dd","Type":"ContainerDied","Data":"898d85761d702f5464f68eb16f900f5a0d514f25a305e59f68a2d3aa7df28a68"}
Feb 20 16:52:13 crc kubenswrapper[4697]: I0220 16:52:13.430898 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7bf9cb54f8-bfxkg"
Feb 20 16:52:13 crc kubenswrapper[4697]: I0220 16:52:13.745797 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6bc54df884-mx794"
Feb 20 16:52:13 crc kubenswrapper[4697]: I0220 16:52:13.802934 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bf9cb54f8-bfxkg"]
Feb 20 16:52:13 crc kubenswrapper[4697]: I0220 16:52:13.882026 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bf9cb54f8-bfxkg" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon-log" containerID="cri-o://6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d" gracePeriod=30
Feb 20 16:52:13 crc kubenswrapper[4697]: I0220 16:52:13.882388 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bf9cb54f8-bfxkg" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon" containerID="cri-o://29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7" gracePeriod=30
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.053482 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.074578 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.215482 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dzcml"
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.217959 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57ffcc5f55-zw4vb"
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.247115 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc"
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.313263 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78hp4\" (UniqueName: \"kubernetes.io/projected/21ea8664-e35f-40a2-9f86-391033cfc7bd-kube-api-access-78hp4\") pod \"21ea8664-e35f-40a2-9f86-391033cfc7bd\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.313315 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-ovndb-tls-certs\") pod \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.314670 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-config\") pod \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.314733 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g885n\" (UniqueName: \"kubernetes.io/projected/4bc8e40c-20d6-41e9-9f4e-25112a77e115-kube-api-access-g885n\") pod \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.314802 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-public-tls-certs\") pod \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.314832 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-combined-ca-bundle\") pod \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.314903 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-nb\") pod \"21ea8664-e35f-40a2-9f86-391033cfc7bd\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.314994 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-internal-tls-certs\") pod \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.315082 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-db-sync-config-data\") pod \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.315128 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jvsl\" (UniqueName: \"kubernetes.io/projected/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-kube-api-access-5jvsl\") pod \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.315156 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-httpd-config\") pod \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\" (UID: \"b3577c39-e7f5-4def-97f1-8e68e4bc07dd\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.315233 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-swift-storage-0\") pod \"21ea8664-e35f-40a2-9f86-391033cfc7bd\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.315283 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-scripts\") pod \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.315311 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-config\") pod \"21ea8664-e35f-40a2-9f86-391033cfc7bd\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.316332 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-combined-ca-bundle\") pod \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") "
Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.316565 4697 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bc8e40c-20d6-41e9-9f4e-25112a77e115-etc-machine-id\") pod \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.316595 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-svc\") pod \"21ea8664-e35f-40a2-9f86-391033cfc7bd\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.316648 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-config-data\") pod \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\" (UID: \"4bc8e40c-20d6-41e9-9f4e-25112a77e115\") " Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.316676 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-sb\") pod \"21ea8664-e35f-40a2-9f86-391033cfc7bd\" (UID: \"21ea8664-e35f-40a2-9f86-391033cfc7bd\") " Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.322582 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ea8664-e35f-40a2-9f86-391033cfc7bd-kube-api-access-78hp4" (OuterVolumeSpecName: "kube-api-access-78hp4") pod "21ea8664-e35f-40a2-9f86-391033cfc7bd" (UID: "21ea8664-e35f-40a2-9f86-391033cfc7bd"). InnerVolumeSpecName "kube-api-access-78hp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.335578 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bc8e40c-20d6-41e9-9f4e-25112a77e115-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4bc8e40c-20d6-41e9-9f4e-25112a77e115" (UID: "4bc8e40c-20d6-41e9-9f4e-25112a77e115"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.336555 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b3577c39-e7f5-4def-97f1-8e68e4bc07dd" (UID: "b3577c39-e7f5-4def-97f1-8e68e4bc07dd"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.341928 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4bc8e40c-20d6-41e9-9f4e-25112a77e115" (UID: "4bc8e40c-20d6-41e9-9f4e-25112a77e115"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.349586 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-kube-api-access-5jvsl" (OuterVolumeSpecName: "kube-api-access-5jvsl") pod "b3577c39-e7f5-4def-97f1-8e68e4bc07dd" (UID: "b3577c39-e7f5-4def-97f1-8e68e4bc07dd"). InnerVolumeSpecName "kube-api-access-5jvsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.361233 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-scripts" (OuterVolumeSpecName: "scripts") pod "4bc8e40c-20d6-41e9-9f4e-25112a77e115" (UID: "4bc8e40c-20d6-41e9-9f4e-25112a77e115"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.365461 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc8e40c-20d6-41e9-9f4e-25112a77e115-kube-api-access-g885n" (OuterVolumeSpecName: "kube-api-access-g885n") pod "4bc8e40c-20d6-41e9-9f4e-25112a77e115" (UID: "4bc8e40c-20d6-41e9-9f4e-25112a77e115"). InnerVolumeSpecName "kube-api-access-g885n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.419032 4697 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bc8e40c-20d6-41e9-9f4e-25112a77e115-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.419069 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78hp4\" (UniqueName: \"kubernetes.io/projected/21ea8664-e35f-40a2-9f86-391033cfc7bd-kube-api-access-78hp4\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.419086 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g885n\" (UniqueName: \"kubernetes.io/projected/4bc8e40c-20d6-41e9-9f4e-25112a77e115-kube-api-access-g885n\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.419099 4697 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.419111 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jvsl\" (UniqueName: \"kubernetes.io/projected/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-kube-api-access-5jvsl\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.419123 4697 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.419133 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.442149 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21ea8664-e35f-40a2-9f86-391033cfc7bd" (UID: "21ea8664-e35f-40a2-9f86-391033cfc7bd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.451886 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b3577c39-e7f5-4def-97f1-8e68e4bc07dd" (UID: "b3577c39-e7f5-4def-97f1-8e68e4bc07dd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.461252 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21ea8664-e35f-40a2-9f86-391033cfc7bd" (UID: "21ea8664-e35f-40a2-9f86-391033cfc7bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.462183 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-config" (OuterVolumeSpecName: "config") pod "b3577c39-e7f5-4def-97f1-8e68e4bc07dd" (UID: "b3577c39-e7f5-4def-97f1-8e68e4bc07dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.479983 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bc8e40c-20d6-41e9-9f4e-25112a77e115" (UID: "4bc8e40c-20d6-41e9-9f4e-25112a77e115"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.482090 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21ea8664-e35f-40a2-9f86-391033cfc7bd" (UID: "21ea8664-e35f-40a2-9f86-391033cfc7bd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.494041 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-config-data" (OuterVolumeSpecName: "config-data") pod "4bc8e40c-20d6-41e9-9f4e-25112a77e115" (UID: "4bc8e40c-20d6-41e9-9f4e-25112a77e115"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.521351 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.521616 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.521694 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.521775 4697 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.521840 4697 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.521903 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4bc8e40c-20d6-41e9-9f4e-25112a77e115-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.521968 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.537657 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21ea8664-e35f-40a2-9f86-391033cfc7bd" (UID: "21ea8664-e35f-40a2-9f86-391033cfc7bd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.539905 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3577c39-e7f5-4def-97f1-8e68e4bc07dd" (UID: "b3577c39-e7f5-4def-97f1-8e68e4bc07dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.541081 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-config" (OuterVolumeSpecName: "config") pod "21ea8664-e35f-40a2-9f86-391033cfc7bd" (UID: "21ea8664-e35f-40a2-9f86-391033cfc7bd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.550647 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b3577c39-e7f5-4def-97f1-8e68e4bc07dd" (UID: "b3577c39-e7f5-4def-97f1-8e68e4bc07dd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.584063 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b3577c39-e7f5-4def-97f1-8e68e4bc07dd" (UID: "b3577c39-e7f5-4def-97f1-8e68e4bc07dd"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.624183 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.624230 4697 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.624238 4697 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.624246 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3577c39-e7f5-4def-97f1-8e68e4bc07dd-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.624257 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ea8664-e35f-40a2-9f86-391033cfc7bd-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.637347 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76b77c89fc-t9rjg"] Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.904462 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" event={"ID":"21ea8664-e35f-40a2-9f86-391033cfc7bd","Type":"ContainerDied","Data":"a83caae4db568755ba5b8dc9f3f5d92e52443c42f4f646003d46392d5e4949d2"} Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.904513 4697 scope.go:117] "RemoveContainer" containerID="628bafb9542b89c830ba13bcb4462d7a2f72ae2dc774d358da8bdcae5ac6c56f" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.904585 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ddbff5fc9-jlmsc" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.910464 4697 generic.go:334] "Generic (PLEG): container finished" podID="74cab817-1ae2-4488-9a46-1377bf50d579" containerID="0e97746652449e49babc04c164c4b7c1830dfe71eaf0caec732cd4c31ebffda4" exitCode=0 Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.910516 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b958d6fb8-lff5x" event={"ID":"74cab817-1ae2-4488-9a46-1377bf50d579","Type":"ContainerDied","Data":"0e97746652449e49babc04c164c4b7c1830dfe71eaf0caec732cd4c31ebffda4"} Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.913753 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e","Type":"ContainerStarted","Data":"12cfced469d6bd5f7a254e88c6a9df5e4c3628cc075e1a01ed3fd25baff46ef4"} Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.917749 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57ffcc5f55-zw4vb" event={"ID":"b3577c39-e7f5-4def-97f1-8e68e4bc07dd","Type":"ContainerDied","Data":"e8d65aa10484deba9debd0e779071014a8420b3cc17646732b3392604ebcda18"} Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.917814 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57ffcc5f55-zw4vb" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.924598 4697 generic.go:334] "Generic (PLEG): container finished" podID="38688b72-7aff-4efe-988f-aa147e5865e2" containerID="db9c2ca261106d6c4e7feae41bab1d47736f748ad72e69b7693169273a9162f7" exitCode=137 Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.924626 4697 generic.go:334] "Generic (PLEG): container finished" podID="38688b72-7aff-4efe-988f-aa147e5865e2" containerID="5c904e098eff1f50369dbb1b6e8a680e4219dcb312235c69580d11495a620313" exitCode=137 Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.924664 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c794b85-nbtsc" event={"ID":"38688b72-7aff-4efe-988f-aa147e5865e2","Type":"ContainerDied","Data":"db9c2ca261106d6c4e7feae41bab1d47736f748ad72e69b7693169273a9162f7"} Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.924687 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c794b85-nbtsc" event={"ID":"38688b72-7aff-4efe-988f-aa147e5865e2","Type":"ContainerDied","Data":"5c904e098eff1f50369dbb1b6e8a680e4219dcb312235c69580d11495a620313"} Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.931258 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dzcml" event={"ID":"4bc8e40c-20d6-41e9-9f4e-25112a77e115","Type":"ContainerDied","Data":"74657d3f189c4afdb55290a623f314e1b7eec4efe43a1a5dc188607aebcafe58"} Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.931293 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74657d3f189c4afdb55290a623f314e1b7eec4efe43a1a5dc188607aebcafe58" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.931412 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dzcml" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.937950 4697 generic.go:334] "Generic (PLEG): container finished" podID="d6667b0d-626d-4578-9767-2026d21b1583" containerID="29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7" exitCode=0 Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.938867 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf9cb54f8-bfxkg" event={"ID":"d6667b0d-626d-4578-9767-2026d21b1583","Type":"ContainerDied","Data":"29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7"} Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.952458 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ddbff5fc9-jlmsc"] Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.960706 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.962944 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ddbff5fc9-jlmsc"] Feb 20 16:52:14 crc kubenswrapper[4697]: I0220 16:52:14.991256 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57ffcc5f55-zw4vb"] Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.002508 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-57ffcc5f55-zw4vb"] Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.124097 4697 scope.go:117] "RemoveContainer" containerID="3173ad1cceb9dfcdf17d2bb161110a9af4927e082a45a35d2d9cd40241c69c84" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.282145 4697 scope.go:117] "RemoveContainer" containerID="44f3535f183a3dc31ea58d908f7531be76c8b45705b172c9402ca3cbfc9e6204" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.407692 4697 scope.go:117] "RemoveContainer" containerID="898d85761d702f5464f68eb16f900f5a0d514f25a305e59f68a2d3aa7df28a68" Feb 20 16:52:15 crc 
kubenswrapper[4697]: I0220 16:52:15.524288 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.618706 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.620303 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 16:52:15 crc kubenswrapper[4697]: E0220 16:52:15.620814 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerName="neutron-api" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.620830 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerName="neutron-api" Feb 20 16:52:15 crc kubenswrapper[4697]: E0220 16:52:15.620843 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38688b72-7aff-4efe-988f-aa147e5865e2" containerName="horizon" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.620849 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="38688b72-7aff-4efe-988f-aa147e5865e2" containerName="horizon" Feb 20 16:52:15 crc kubenswrapper[4697]: E0220 16:52:15.620859 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74cab817-1ae2-4488-9a46-1377bf50d579" containerName="barbican-api-log" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.620864 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="74cab817-1ae2-4488-9a46-1377bf50d579" containerName="barbican-api-log" Feb 20 16:52:15 crc kubenswrapper[4697]: E0220 16:52:15.620876 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74cab817-1ae2-4488-9a46-1377bf50d579" containerName="barbican-api" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.620917 4697 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="74cab817-1ae2-4488-9a46-1377bf50d579" containerName="barbican-api" Feb 20 16:52:15 crc kubenswrapper[4697]: E0220 16:52:15.620933 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc8e40c-20d6-41e9-9f4e-25112a77e115" containerName="cinder-db-sync" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.620939 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc8e40c-20d6-41e9-9f4e-25112a77e115" containerName="cinder-db-sync" Feb 20 16:52:15 crc kubenswrapper[4697]: E0220 16:52:15.621019 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerName="neutron-httpd" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621027 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerName="neutron-httpd" Feb 20 16:52:15 crc kubenswrapper[4697]: E0220 16:52:15.621038 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38688b72-7aff-4efe-988f-aa147e5865e2" containerName="horizon-log" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621045 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="38688b72-7aff-4efe-988f-aa147e5865e2" containerName="horizon-log" Feb 20 16:52:15 crc kubenswrapper[4697]: E0220 16:52:15.621051 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ea8664-e35f-40a2-9f86-391033cfc7bd" containerName="dnsmasq-dns" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621057 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ea8664-e35f-40a2-9f86-391033cfc7bd" containerName="dnsmasq-dns" Feb 20 16:52:15 crc kubenswrapper[4697]: E0220 16:52:15.621111 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ea8664-e35f-40a2-9f86-391033cfc7bd" containerName="init" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621117 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ea8664-e35f-40a2-9f86-391033cfc7bd" 
containerName="init" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621324 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="74cab817-1ae2-4488-9a46-1377bf50d579" containerName="barbican-api" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621345 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="74cab817-1ae2-4488-9a46-1377bf50d579" containerName="barbican-api-log" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621352 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerName="neutron-httpd" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621367 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ea8664-e35f-40a2-9f86-391033cfc7bd" containerName="dnsmasq-dns" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621375 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc8e40c-20d6-41e9-9f4e-25112a77e115" containerName="cinder-db-sync" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621384 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" containerName="neutron-api" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621396 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="38688b72-7aff-4efe-988f-aa147e5865e2" containerName="horizon" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.621463 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="38688b72-7aff-4efe-988f-aa147e5865e2" containerName="horizon-log" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.623794 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.631450 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.631625 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.631730 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m2cjp" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.632397 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.668753 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.669494 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38688b72-7aff-4efe-988f-aa147e5865e2-logs\") pod \"38688b72-7aff-4efe-988f-aa147e5865e2\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.669590 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-config-data\") pod \"38688b72-7aff-4efe-988f-aa147e5865e2\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.669685 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/38688b72-7aff-4efe-988f-aa147e5865e2-horizon-secret-key\") pod \"38688b72-7aff-4efe-988f-aa147e5865e2\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.669703 4697 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqsjj\" (UniqueName: \"kubernetes.io/projected/38688b72-7aff-4efe-988f-aa147e5865e2-kube-api-access-dqsjj\") pod \"38688b72-7aff-4efe-988f-aa147e5865e2\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.669786 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-scripts\") pod \"38688b72-7aff-4efe-988f-aa147e5865e2\" (UID: \"38688b72-7aff-4efe-988f-aa147e5865e2\") " Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.670843 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38688b72-7aff-4efe-988f-aa147e5865e2-logs" (OuterVolumeSpecName: "logs") pod "38688b72-7aff-4efe-988f-aa147e5865e2" (UID: "38688b72-7aff-4efe-988f-aa147e5865e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.676605 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38688b72-7aff-4efe-988f-aa147e5865e2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "38688b72-7aff-4efe-988f-aa147e5865e2" (UID: "38688b72-7aff-4efe-988f-aa147e5865e2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.677105 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38688b72-7aff-4efe-988f-aa147e5865e2-kube-api-access-dqsjj" (OuterVolumeSpecName: "kube-api-access-dqsjj") pod "38688b72-7aff-4efe-988f-aa147e5865e2" (UID: "38688b72-7aff-4efe-988f-aa147e5865e2"). InnerVolumeSpecName "kube-api-access-dqsjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.705194 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-config-data" (OuterVolumeSpecName: "config-data") pod "38688b72-7aff-4efe-988f-aa147e5865e2" (UID: "38688b72-7aff-4efe-988f-aa147e5865e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.728108 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b5bbf5bf5-dr9md"] Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.729713 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.739751 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b5bbf5bf5-dr9md"] Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.771428 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74cab817-1ae2-4488-9a46-1377bf50d579-logs\") pod \"74cab817-1ae2-4488-9a46-1377bf50d579\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.771637 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data\") pod \"74cab817-1ae2-4488-9a46-1377bf50d579\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.771729 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data-custom\") pod \"74cab817-1ae2-4488-9a46-1377bf50d579\" (UID: 
\"74cab817-1ae2-4488-9a46-1377bf50d579\") " Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.771798 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhhzn\" (UniqueName: \"kubernetes.io/projected/74cab817-1ae2-4488-9a46-1377bf50d579-kube-api-access-nhhzn\") pod \"74cab817-1ae2-4488-9a46-1377bf50d579\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.773306 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-combined-ca-bundle\") pod \"74cab817-1ae2-4488-9a46-1377bf50d579\" (UID: \"74cab817-1ae2-4488-9a46-1377bf50d579\") " Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.774490 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74cab817-1ae2-4488-9a46-1377bf50d579-logs" (OuterVolumeSpecName: "logs") pod "74cab817-1ae2-4488-9a46-1377bf50d579" (UID: "74cab817-1ae2-4488-9a46-1377bf50d579"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.778144 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "74cab817-1ae2-4488-9a46-1377bf50d579" (UID: "74cab817-1ae2-4488-9a46-1377bf50d579"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.778488 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74cab817-1ae2-4488-9a46-1377bf50d579-kube-api-access-nhhzn" (OuterVolumeSpecName: "kube-api-access-nhhzn") pod "74cab817-1ae2-4488-9a46-1377bf50d579" (UID: "74cab817-1ae2-4488-9a46-1377bf50d579"). 
InnerVolumeSpecName "kube-api-access-nhhzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.779126 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a8af28c-1b6b-4137-9d89-439c6f66e980-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.779279 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.779637 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.779827 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.779848 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " 
pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.779886 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8vrc\" (UniqueName: \"kubernetes.io/projected/7a8af28c-1b6b-4137-9d89-439c6f66e980-kube-api-access-k8vrc\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.780149 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74cab817-1ae2-4488-9a46-1377bf50d579-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.780167 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38688b72-7aff-4efe-988f-aa147e5865e2-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.780177 4697 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.780188 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhhzn\" (UniqueName: \"kubernetes.io/projected/74cab817-1ae2-4488-9a46-1377bf50d579-kube-api-access-nhhzn\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.780197 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.780206 4697 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/38688b72-7aff-4efe-988f-aa147e5865e2-horizon-secret-key\") on node 
\"crc\" DevicePath \"\"" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.780214 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqsjj\" (UniqueName: \"kubernetes.io/projected/38688b72-7aff-4efe-988f-aa147e5865e2-kube-api-access-dqsjj\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.795751 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-scripts" (OuterVolumeSpecName: "scripts") pod "38688b72-7aff-4efe-988f-aa147e5865e2" (UID: "38688b72-7aff-4efe-988f-aa147e5865e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.847847 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74cab817-1ae2-4488-9a46-1377bf50d579" (UID: "74cab817-1ae2-4488-9a46-1377bf50d579"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.869948 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data" (OuterVolumeSpecName: "config-data") pod "74cab817-1ae2-4488-9a46-1377bf50d579" (UID: "74cab817-1ae2-4488-9a46-1377bf50d579"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.886626 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887104 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887135 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8vrc\" (UniqueName: \"kubernetes.io/projected/7a8af28c-1b6b-4137-9d89-439c6f66e980-kube-api-access-k8vrc\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887170 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25dm9\" (UniqueName: \"kubernetes.io/projected/aeadc920-c9f4-4965-9b75-a233291a3f6a-kube-api-access-25dm9\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887210 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-swift-storage-0\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" 
Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887244 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a8af28c-1b6b-4137-9d89-439c6f66e980-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887258 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887289 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-sb\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887316 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-nb\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887342 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-config\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887358 4697 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-svc\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887384 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887469 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887484 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38688b72-7aff-4efe-988f-aa147e5865e2-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887493 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74cab817-1ae2-4488-9a46-1377bf50d579-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.887952 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a8af28c-1b6b-4137-9d89-439c6f66e980-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.891970 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.892865 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.895085 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.895885 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.913759 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.916056 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.926869 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.929722 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.938562 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8vrc\" (UniqueName: \"kubernetes.io/projected/7a8af28c-1b6b-4137-9d89-439c6f66e980-kube-api-access-k8vrc\") pod \"cinder-scheduler-0\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.972537 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.975496 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b988e030-0e85-401e-bd22-057f9c2de43d","Type":"ContainerStarted","Data":"ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452"} Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.975655 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="ceilometer-central-agent" containerID="cri-o://deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04" gracePeriod=30 Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.975727 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.976070 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="proxy-httpd" 
containerID="cri-o://ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452" gracePeriod=30 Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.976158 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="sg-core" containerID="cri-o://8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e" gracePeriod=30 Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.976212 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="ceilometer-notification-agent" containerID="cri-o://34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be" gracePeriod=30 Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.990685 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7eaf42b1-80b0-43d3-8820-0b17637d46a6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.990725 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-sb\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.990754 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-nb\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 
16:52:15.990786 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-config\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.990801 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-svc\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.990870 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pg6l\" (UniqueName: \"kubernetes.io/projected/7eaf42b1-80b0-43d3-8820-0b17637d46a6-kube-api-access-5pg6l\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.990888 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.990927 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data-custom\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.990956 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-25dm9\" (UniqueName: \"kubernetes.io/projected/aeadc920-c9f4-4965-9b75-a233291a3f6a-kube-api-access-25dm9\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.990977 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-scripts\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.991016 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-swift-storage-0\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.991044 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.991062 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eaf42b1-80b0-43d3-8820-0b17637d46a6-logs\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.992804 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-sb\") 
pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.993314 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-swift-storage-0\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.993328 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-nb\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.993846 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-config\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.994247 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-svc\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.994945 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b958d6fb8-lff5x" event={"ID":"74cab817-1ae2-4488-9a46-1377bf50d579","Type":"ContainerDied","Data":"378bcda5d5b866811e18be7354d4796ab14daed12b283b9b2dc2169c0c96cd66"} Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 
16:52:15.995054 4697 scope.go:117] "RemoveContainer" containerID="0e97746652449e49babc04c164c4b7c1830dfe71eaf0caec732cd4c31ebffda4" Feb 20 16:52:15 crc kubenswrapper[4697]: I0220 16:52:15.995207 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b958d6fb8-lff5x" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.011151 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25dm9\" (UniqueName: \"kubernetes.io/projected/aeadc920-c9f4-4965-9b75-a233291a3f6a-kube-api-access-25dm9\") pod \"dnsmasq-dns-b5bbf5bf5-dr9md\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.023110 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76b77c89fc-t9rjg" event={"ID":"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0","Type":"ContainerStarted","Data":"a9e47f04a881adaa1ba12d69400b46c92222c1cdb086be05074df5ce57d86b25"} Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.023145 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76b77c89fc-t9rjg" event={"ID":"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0","Type":"ContainerStarted","Data":"26449a8f212863328db6b2ebad7aa6edcb5fb8d96e36c0502b731be134f0d05c"} Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.025482 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.240165394 podStartE2EDuration="1m9.025464201s" podCreationTimestamp="2026-02-20 16:51:07 +0000 UTC" firstStartedPulling="2026-02-20 16:51:10.381267771 +0000 UTC m=+1178.161313179" lastFinishedPulling="2026-02-20 16:52:15.166566578 +0000 UTC m=+1242.946611986" observedRunningTime="2026-02-20 16:52:15.998563171 +0000 UTC m=+1243.778608579" watchObservedRunningTime="2026-02-20 16:52:16.025464201 +0000 UTC m=+1243.805509609" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 
16:52:16.059972 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b958d6fb8-lff5x"] Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.062376 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.066229 4697 scope.go:117] "RemoveContainer" containerID="f0adea8f73186007ac9ec77b593d4d32f1ff9a574813be46e3e427521776984c" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.066952 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c794b85-nbtsc" event={"ID":"38688b72-7aff-4efe-988f-aa147e5865e2","Type":"ContainerDied","Data":"4eb7122a0a2b0d83cbee73f1b44364376d2a055ee0743433625411cb8984f051"} Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.067022 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-679c794b85-nbtsc" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.098827 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.098886 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eaf42b1-80b0-43d3-8820-0b17637d46a6-logs\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.098920 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7eaf42b1-80b0-43d3-8820-0b17637d46a6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") 
" pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.098989 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pg6l\" (UniqueName: \"kubernetes.io/projected/7eaf42b1-80b0-43d3-8820-0b17637d46a6-kube-api-access-5pg6l\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.099016 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.099056 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data-custom\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.099089 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-scripts\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.101226 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7eaf42b1-80b0-43d3-8820-0b17637d46a6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.102120 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-scripts\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.105586 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.108115 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eaf42b1-80b0-43d3-8820-0b17637d46a6-logs\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.113505 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data-custom\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.116492 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.122939 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5b958d6fb8-lff5x"] Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.127607 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pg6l\" (UniqueName: \"kubernetes.io/projected/7eaf42b1-80b0-43d3-8820-0b17637d46a6-kube-api-access-5pg6l\") pod \"cinder-api-0\" 
(UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.269024 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.295796 4697 scope.go:117] "RemoveContainer" containerID="db9c2ca261106d6c4e7feae41bab1d47736f748ad72e69b7693169273a9162f7" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.316731 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-679c794b85-nbtsc"] Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.321230 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-679c794b85-nbtsc"] Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.551771 4697 scope.go:117] "RemoveContainer" containerID="5c904e098eff1f50369dbb1b6e8a680e4219dcb312235c69580d11495a620313" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.623649 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.742987 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b5bbf5bf5-dr9md"] Feb 20 16:52:16 crc kubenswrapper[4697]: W0220 16:52:16.831934 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeadc920_c9f4_4965_9b75_a233291a3f6a.slice/crio-4a5fa885183ff64ba1f284e4c73bf3c9c0dcac4e7a8d66cbe2786c3a01962996 WatchSource:0}: Error finding container 4a5fa885183ff64ba1f284e4c73bf3c9c0dcac4e7a8d66cbe2786c3a01962996: Status 404 returned error can't find the container with id 4a5fa885183ff64ba1f284e4c73bf3c9c0dcac4e7a8d66cbe2786c3a01962996 Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.888309 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21ea8664-e35f-40a2-9f86-391033cfc7bd" 
path="/var/lib/kubelet/pods/21ea8664-e35f-40a2-9f86-391033cfc7bd/volumes" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.891341 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38688b72-7aff-4efe-988f-aa147e5865e2" path="/var/lib/kubelet/pods/38688b72-7aff-4efe-988f-aa147e5865e2/volumes" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.892221 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74cab817-1ae2-4488-9a46-1377bf50d579" path="/var/lib/kubelet/pods/74cab817-1ae2-4488-9a46-1377bf50d579/volumes" Feb 20 16:52:16 crc kubenswrapper[4697]: I0220 16:52:16.898989 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3577c39-e7f5-4def-97f1-8e68e4bc07dd" path="/var/lib/kubelet/pods/b3577c39-e7f5-4def-97f1-8e68e4bc07dd/volumes" Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.001609 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.109428 4697 generic.go:334] "Generic (PLEG): container finished" podID="b988e030-0e85-401e-bd22-057f9c2de43d" containerID="ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452" exitCode=0 Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.109472 4697 generic.go:334] "Generic (PLEG): container finished" podID="b988e030-0e85-401e-bd22-057f9c2de43d" containerID="8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e" exitCode=2 Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.109481 4697 generic.go:334] "Generic (PLEG): container finished" podID="b988e030-0e85-401e-bd22-057f9c2de43d" containerID="deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04" exitCode=0 Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.109523 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b988e030-0e85-401e-bd22-057f9c2de43d","Type":"ContainerDied","Data":"ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452"} Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.109549 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b988e030-0e85-401e-bd22-057f9c2de43d","Type":"ContainerDied","Data":"8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e"} Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.109560 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b988e030-0e85-401e-bd22-057f9c2de43d","Type":"ContainerDied","Data":"deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04"} Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.112535 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76b77c89fc-t9rjg" event={"ID":"0e0dd99f-4186-47a0-b6ba-c8d4abd040b0","Type":"ContainerStarted","Data":"6866fd4c85cbd7115b5b2bd4ceeaf563038473a1adbe6781100ffd54a2a65c82"} Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.112710 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76b77c89fc-t9rjg" Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.120365 4697 generic.go:334] "Generic (PLEG): container finished" podID="aeadc920-c9f4-4965-9b75-a233291a3f6a" containerID="48333e517f570ff008ac05ba32b933cc91a12ae29e807f9d64bab4ee382e8b83" exitCode=0 Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.120416 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" event={"ID":"aeadc920-c9f4-4965-9b75-a233291a3f6a","Type":"ContainerDied","Data":"48333e517f570ff008ac05ba32b933cc91a12ae29e807f9d64bab4ee382e8b83"} Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.120449 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" 
event={"ID":"aeadc920-c9f4-4965-9b75-a233291a3f6a","Type":"ContainerStarted","Data":"4a5fa885183ff64ba1f284e4c73bf3c9c0dcac4e7a8d66cbe2786c3a01962996"} Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.124849 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a8af28c-1b6b-4137-9d89-439c6f66e980","Type":"ContainerStarted","Data":"96b4f4328109839f8a6b4c38788aec43c562377e03e1a734d327f0205dc94cc3"} Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.126229 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bf9cb54f8-bfxkg" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.167:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.167:8443: connect: connection refused" Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.140423 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76b77c89fc-t9rjg" podStartSLOduration=10.14040331 podStartE2EDuration="10.14040331s" podCreationTimestamp="2026-02-20 16:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:17.127690348 +0000 UTC m=+1244.907735756" watchObservedRunningTime="2026-02-20 16:52:17.14040331 +0000 UTC m=+1244.920448718" Feb 20 16:52:17 crc kubenswrapper[4697]: I0220 16:52:17.146565 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7eaf42b1-80b0-43d3-8820-0b17637d46a6","Type":"ContainerStarted","Data":"4d8b7ff22e776c97ce5a9c99a456a0588c35f1d90c535e45641566069dc332e5"} Feb 20 16:52:18 crc kubenswrapper[4697]: I0220 16:52:18.166267 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 16:52:18 crc kubenswrapper[4697]: I0220 16:52:18.192639 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" event={"ID":"aeadc920-c9f4-4965-9b75-a233291a3f6a","Type":"ContainerStarted","Data":"862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3"} Feb 20 16:52:18 crc kubenswrapper[4697]: I0220 16:52:18.192932 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:18 crc kubenswrapper[4697]: I0220 16:52:18.219585 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" podStartSLOduration=3.219566162 podStartE2EDuration="3.219566162s" podCreationTimestamp="2026-02-20 16:52:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:18.20878011 +0000 UTC m=+1245.988825518" watchObservedRunningTime="2026-02-20 16:52:18.219566162 +0000 UTC m=+1245.999611570" Feb 20 16:52:18 crc kubenswrapper[4697]: I0220 16:52:18.228801 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a8af28c-1b6b-4137-9d89-439c6f66e980","Type":"ContainerStarted","Data":"cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b"} Feb 20 16:52:18 crc kubenswrapper[4697]: I0220 16:52:18.231182 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7eaf42b1-80b0-43d3-8820-0b17637d46a6","Type":"ContainerStarted","Data":"a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca"} Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.086021 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.184771 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-log-httpd\") pod \"b988e030-0e85-401e-bd22-057f9c2de43d\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.184840 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-combined-ca-bundle\") pod \"b988e030-0e85-401e-bd22-057f9c2de43d\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.184871 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-run-httpd\") pod \"b988e030-0e85-401e-bd22-057f9c2de43d\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.184899 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-sg-core-conf-yaml\") pod \"b988e030-0e85-401e-bd22-057f9c2de43d\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.184971 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-scripts\") pod \"b988e030-0e85-401e-bd22-057f9c2de43d\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.185028 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drptr\" (UniqueName: 
\"kubernetes.io/projected/b988e030-0e85-401e-bd22-057f9c2de43d-kube-api-access-drptr\") pod \"b988e030-0e85-401e-bd22-057f9c2de43d\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.185053 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-config-data\") pod \"b988e030-0e85-401e-bd22-057f9c2de43d\" (UID: \"b988e030-0e85-401e-bd22-057f9c2de43d\") " Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.187987 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b988e030-0e85-401e-bd22-057f9c2de43d" (UID: "b988e030-0e85-401e-bd22-057f9c2de43d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.188499 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b988e030-0e85-401e-bd22-057f9c2de43d" (UID: "b988e030-0e85-401e-bd22-057f9c2de43d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.195413 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-scripts" (OuterVolumeSpecName: "scripts") pod "b988e030-0e85-401e-bd22-057f9c2de43d" (UID: "b988e030-0e85-401e-bd22-057f9c2de43d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.212125 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b988e030-0e85-401e-bd22-057f9c2de43d-kube-api-access-drptr" (OuterVolumeSpecName: "kube-api-access-drptr") pod "b988e030-0e85-401e-bd22-057f9c2de43d" (UID: "b988e030-0e85-401e-bd22-057f9c2de43d"). InnerVolumeSpecName "kube-api-access-drptr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.227217 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b988e030-0e85-401e-bd22-057f9c2de43d" (UID: "b988e030-0e85-401e-bd22-057f9c2de43d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.246857 4697 generic.go:334] "Generic (PLEG): container finished" podID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerID="12cfced469d6bd5f7a254e88c6a9df5e4c3628cc075e1a01ed3fd25baff46ef4" exitCode=1 Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.246918 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e","Type":"ContainerDied","Data":"12cfced469d6bd5f7a254e88c6a9df5e4c3628cc075e1a01ed3fd25baff46ef4"} Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.246951 4697 scope.go:117] "RemoveContainer" containerID="bf201182bdd16e8f323bc9095059948530ad616d7108a16c235e3db97ee1f97f" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.247603 4697 scope.go:117] "RemoveContainer" containerID="12cfced469d6bd5f7a254e88c6a9df5e4c3628cc075e1a01ed3fd25baff46ef4" Feb 20 16:52:19 crc kubenswrapper[4697]: E0220 16:52:19.247893 4697 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c9c1a3da-b8c3-4825-ba1e-c6bbecff953e)\"" pod="openstack/watcher-decision-engine-0" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.252848 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a8af28c-1b6b-4137-9d89-439c6f66e980","Type":"ContainerStarted","Data":"9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f"} Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.259250 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7eaf42b1-80b0-43d3-8820-0b17637d46a6","Type":"ContainerStarted","Data":"cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652"} Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.259510 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7eaf42b1-80b0-43d3-8820-0b17637d46a6" containerName="cinder-api-log" containerID="cri-o://a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca" gracePeriod=30 Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.259610 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.259646 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7eaf42b1-80b0-43d3-8820-0b17637d46a6" containerName="cinder-api" containerID="cri-o://cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652" gracePeriod=30 Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.272176 4697 generic.go:334] "Generic (PLEG): container finished" podID="b988e030-0e85-401e-bd22-057f9c2de43d" 
containerID="34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be" exitCode=0 Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.273003 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.273172 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b988e030-0e85-401e-bd22-057f9c2de43d","Type":"ContainerDied","Data":"34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be"} Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.273193 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b988e030-0e85-401e-bd22-057f9c2de43d","Type":"ContainerDied","Data":"322cdfcd049df9e9090d84e0fbfeda566988bf1a2b7d712d72106dc88bc2c126"} Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.289572 4697 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.289600 4697 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b988e030-0e85-401e-bd22-057f9c2de43d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.289609 4697 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.289618 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.289626 4697 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-drptr\" (UniqueName: \"kubernetes.io/projected/b988e030-0e85-401e-bd22-057f9c2de43d-kube-api-access-drptr\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.302556 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.064129683 podStartE2EDuration="4.302504042s" podCreationTimestamp="2026-02-20 16:52:15 +0000 UTC" firstStartedPulling="2026-02-20 16:52:16.635770617 +0000 UTC m=+1244.415816015" lastFinishedPulling="2026-02-20 16:52:16.874144946 +0000 UTC m=+1244.654190374" observedRunningTime="2026-02-20 16:52:19.283825109 +0000 UTC m=+1247.063870517" watchObservedRunningTime="2026-02-20 16:52:19.302504042 +0000 UTC m=+1247.082549450" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.317789 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.317771208 podStartE2EDuration="4.317771208s" podCreationTimestamp="2026-02-20 16:52:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:19.300925002 +0000 UTC m=+1247.080970410" watchObservedRunningTime="2026-02-20 16:52:19.317771208 +0000 UTC m=+1247.097816616" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.358595 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b988e030-0e85-401e-bd22-057f9c2de43d" (UID: "b988e030-0e85-401e-bd22-057f9c2de43d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.392307 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.412202 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-config-data" (OuterVolumeSpecName: "config-data") pod "b988e030-0e85-401e-bd22-057f9c2de43d" (UID: "b988e030-0e85-401e-bd22-057f9c2de43d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.471499 4697 scope.go:117] "RemoveContainer" containerID="ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.491836 4697 scope.go:117] "RemoveContainer" containerID="8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.493663 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b988e030-0e85-401e-bd22-057f9c2de43d-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.521156 4697 scope.go:117] "RemoveContainer" containerID="34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.543139 4697 scope.go:117] "RemoveContainer" containerID="deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.565705 4697 scope.go:117] "RemoveContainer" containerID="ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452" Feb 20 16:52:19 crc kubenswrapper[4697]: E0220 16:52:19.566253 4697 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452\": container with ID starting with ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452 not found: ID does not exist" containerID="ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.566292 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452"} err="failed to get container status \"ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452\": rpc error: code = NotFound desc = could not find container \"ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452\": container with ID starting with ebfe2ffc3e096f5e7a1dcadd01db48622ac0ec6c4a4f9090e3037b5871d8d452 not found: ID does not exist" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.566346 4697 scope.go:117] "RemoveContainer" containerID="8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e" Feb 20 16:52:19 crc kubenswrapper[4697]: E0220 16:52:19.566735 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e\": container with ID starting with 8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e not found: ID does not exist" containerID="8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.566768 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e"} err="failed to get container status \"8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e\": rpc error: code = NotFound desc = could 
not find container \"8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e\": container with ID starting with 8e9cdd010eb8e6e002a496b3a2f63d02237f047aac69e3250548f1de546c8e1e not found: ID does not exist" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.566791 4697 scope.go:117] "RemoveContainer" containerID="34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be" Feb 20 16:52:19 crc kubenswrapper[4697]: E0220 16:52:19.567027 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be\": container with ID starting with 34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be not found: ID does not exist" containerID="34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.567059 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be"} err="failed to get container status \"34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be\": rpc error: code = NotFound desc = could not find container \"34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be\": container with ID starting with 34696f296edc027d0eef639fa8f9520e2e4c5346148b8abab5523896d0c743be not found: ID does not exist" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.567075 4697 scope.go:117] "RemoveContainer" containerID="deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04" Feb 20 16:52:19 crc kubenswrapper[4697]: E0220 16:52:19.567277 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04\": container with ID starting with deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04 not found: 
ID does not exist" containerID="deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.567305 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04"} err="failed to get container status \"deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04\": rpc error: code = NotFound desc = could not find container \"deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04\": container with ID starting with deadb8b7b291e565e71580b89ffc5b06b3f13e2c62fe51383e80656396813d04 not found: ID does not exist" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.606762 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.614379 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.624869 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:19 crc kubenswrapper[4697]: E0220 16:52:19.626219 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="sg-core" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.626237 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="sg-core" Feb 20 16:52:19 crc kubenswrapper[4697]: E0220 16:52:19.626250 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="ceilometer-central-agent" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.626257 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="ceilometer-central-agent" Feb 20 16:52:19 crc kubenswrapper[4697]: E0220 16:52:19.626270 4697 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="ceilometer-notification-agent" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.626276 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="ceilometer-notification-agent" Feb 20 16:52:19 crc kubenswrapper[4697]: E0220 16:52:19.626296 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="proxy-httpd" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.626302 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="proxy-httpd" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.626555 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="proxy-httpd" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.626568 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="ceilometer-central-agent" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.626583 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="sg-core" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.626593 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" containerName="ceilometer-notification-agent" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.628604 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.636346 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.636614 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.636662 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.696863 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-config-data\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.697124 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.697211 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.697296 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-scripts\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " 
pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.697372 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwnkf\" (UniqueName: \"kubernetes.io/projected/671e5d87-8407-4686-93f7-1786cc51ab43-kube-api-access-nwnkf\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.697532 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-log-httpd\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.697611 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-run-httpd\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.721619 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.721931 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.799680 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-config-data\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.800307 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.800469 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.800620 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-scripts\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.800750 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwnkf\" (UniqueName: \"kubernetes.io/projected/671e5d87-8407-4686-93f7-1786cc51ab43-kube-api-access-nwnkf\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.800875 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-log-httpd\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.800979 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-run-httpd\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 
16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.801565 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-run-httpd\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.801671 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-log-httpd\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.805771 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-config-data\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.807120 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.807332 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-scripts\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.810269 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.822814 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwnkf\" (UniqueName: \"kubernetes.io/projected/671e5d87-8407-4686-93f7-1786cc51ab43-kube-api-access-nwnkf\") pod \"ceilometer-0\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " pod="openstack/ceilometer-0" Feb 20 16:52:19 crc kubenswrapper[4697]: I0220 16:52:19.953839 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:20 crc kubenswrapper[4697]: I0220 16:52:20.287654 4697 scope.go:117] "RemoveContainer" containerID="12cfced469d6bd5f7a254e88c6a9df5e4c3628cc075e1a01ed3fd25baff46ef4" Feb 20 16:52:20 crc kubenswrapper[4697]: E0220 16:52:20.288295 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c9c1a3da-b8c3-4825-ba1e-c6bbecff953e)\"" pod="openstack/watcher-decision-engine-0" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" Feb 20 16:52:20 crc kubenswrapper[4697]: I0220 16:52:20.290087 4697 generic.go:334] "Generic (PLEG): container finished" podID="7eaf42b1-80b0-43d3-8820-0b17637d46a6" containerID="a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca" exitCode=143 Feb 20 16:52:20 crc kubenswrapper[4697]: I0220 16:52:20.290186 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7eaf42b1-80b0-43d3-8820-0b17637d46a6","Type":"ContainerDied","Data":"a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca"} Feb 20 16:52:20 crc kubenswrapper[4697]: I0220 16:52:20.422010 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:20 crc kubenswrapper[4697]: W0220 16:52:20.427741 4697 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod671e5d87_8407_4686_93f7_1786cc51ab43.slice/crio-b4650d0d9a7354ea535ee578eb0f168ae59902ef38278f1716c5bfca7170494b WatchSource:0}: Error finding container b4650d0d9a7354ea535ee578eb0f168ae59902ef38278f1716c5bfca7170494b: Status 404 returned error can't find the container with id b4650d0d9a7354ea535ee578eb0f168ae59902ef38278f1716c5bfca7170494b Feb 20 16:52:20 crc kubenswrapper[4697]: I0220 16:52:20.887623 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b988e030-0e85-401e-bd22-057f9c2de43d" path="/var/lib/kubelet/pods/b988e030-0e85-401e-bd22-057f9c2de43d/volumes" Feb 20 16:52:20 crc kubenswrapper[4697]: I0220 16:52:20.973041 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.300596 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"671e5d87-8407-4686-93f7-1786cc51ab43","Type":"ContainerStarted","Data":"c348ac1ab4df35692172bcd20cd3859edb1839a4ecae2755cad96f2f6e751521"} Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.300920 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"671e5d87-8407-4686-93f7-1786cc51ab43","Type":"ContainerStarted","Data":"f2a294635ef7a9d048fae0cbad91d12928dc12ae556ca28dc4aec4a4124e5e4f"} Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.300943 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"671e5d87-8407-4686-93f7-1786cc51ab43","Type":"ContainerStarted","Data":"b4650d0d9a7354ea535ee578eb0f168ae59902ef38278f1716c5bfca7170494b"} Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.301447 4697 scope.go:117] "RemoveContainer" containerID="12cfced469d6bd5f7a254e88c6a9df5e4c3628cc075e1a01ed3fd25baff46ef4" Feb 20 16:52:21 crc 
kubenswrapper[4697]: E0220 16:52:21.301675 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c9c1a3da-b8c3-4825-ba1e-c6bbecff953e)\"" pod="openstack/watcher-decision-engine-0" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.532457 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.533474 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.786886 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-57d8cdd7b4-pxpls"] Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.788650 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.790291 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57d8cdd7b4-pxpls"] Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.969785 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmj68\" (UniqueName: \"kubernetes.io/projected/8cecab8a-e2db-476f-89d1-b28b4f585d57-kube-api-access-vmj68\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.969876 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-config-data\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.969948 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-combined-ca-bundle\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.970037 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cecab8a-e2db-476f-89d1-b28b4f585d57-logs\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.970121 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-scripts\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.970151 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-internal-tls-certs\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:21 crc kubenswrapper[4697]: I0220 16:52:21.970254 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-public-tls-certs\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.072338 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-public-tls-certs\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.073278 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmj68\" (UniqueName: \"kubernetes.io/projected/8cecab8a-e2db-476f-89d1-b28b4f585d57-kube-api-access-vmj68\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.073589 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-config-data\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.073666 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-combined-ca-bundle\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.073771 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cecab8a-e2db-476f-89d1-b28b4f585d57-logs\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.073808 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-scripts\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.073830 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-internal-tls-certs\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.074192 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cecab8a-e2db-476f-89d1-b28b4f585d57-logs\") pod \"placement-57d8cdd7b4-pxpls\" (UID: 
\"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.080937 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-internal-tls-certs\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.081224 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-public-tls-certs\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.085012 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-config-data\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.088699 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-combined-ca-bundle\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.088862 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmj68\" (UniqueName: \"kubernetes.io/projected/8cecab8a-e2db-476f-89d1-b28b4f585d57-kube-api-access-vmj68\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 
16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.095561 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cecab8a-e2db-476f-89d1-b28b4f585d57-scripts\") pod \"placement-57d8cdd7b4-pxpls\" (UID: \"8cecab8a-e2db-476f-89d1-b28b4f585d57\") " pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.178096 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.328808 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"671e5d87-8407-4686-93f7-1786cc51ab43","Type":"ContainerStarted","Data":"3832968f98762b56f14b83425d542dda9fb74b2de3c2936c0a022909e7bb466b"} Feb 20 16:52:22 crc kubenswrapper[4697]: I0220 16:52:22.765414 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57d8cdd7b4-pxpls"] Feb 20 16:52:23 crc kubenswrapper[4697]: I0220 16:52:23.345055 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57d8cdd7b4-pxpls" event={"ID":"8cecab8a-e2db-476f-89d1-b28b4f585d57","Type":"ContainerStarted","Data":"e42b7745a90617762e6875d05b6c108e013938927e95d4c367998f5686de2f1f"} Feb 20 16:52:23 crc kubenswrapper[4697]: I0220 16:52:23.345640 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57d8cdd7b4-pxpls" event={"ID":"8cecab8a-e2db-476f-89d1-b28b4f585d57","Type":"ContainerStarted","Data":"acb05dd3b2fe46798b40dfe21c5bc4a30616b833f66788cfc10b91bf9181ce15"} Feb 20 16:52:24 crc kubenswrapper[4697]: I0220 16:52:24.362064 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57d8cdd7b4-pxpls" event={"ID":"8cecab8a-e2db-476f-89d1-b28b4f585d57","Type":"ContainerStarted","Data":"f4dff59856a883d787816a339c667e7a4a263693fe20b879741142e5e16f36f1"} Feb 20 16:52:24 crc kubenswrapper[4697]: I0220 16:52:24.363471 
4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:24 crc kubenswrapper[4697]: I0220 16:52:24.364356 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:24 crc kubenswrapper[4697]: I0220 16:52:24.365027 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"671e5d87-8407-4686-93f7-1786cc51ab43","Type":"ContainerStarted","Data":"b3fe80799d3838360c1db67ef5874ba4b2074bf0aff1730e8ff615eed9e39ef4"} Feb 20 16:52:24 crc kubenswrapper[4697]: I0220 16:52:24.365261 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 16:52:24 crc kubenswrapper[4697]: I0220 16:52:24.423995 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-57d8cdd7b4-pxpls" podStartSLOduration=3.423973011 podStartE2EDuration="3.423973011s" podCreationTimestamp="2026-02-20 16:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:24.389458048 +0000 UTC m=+1252.169503476" watchObservedRunningTime="2026-02-20 16:52:24.423973011 +0000 UTC m=+1252.204018409" Feb 20 16:52:24 crc kubenswrapper[4697]: I0220 16:52:24.428127 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.52728513 podStartE2EDuration="5.428117706s" podCreationTimestamp="2026-02-20 16:52:19 +0000 UTC" firstStartedPulling="2026-02-20 16:52:20.430205203 +0000 UTC m=+1248.210250611" lastFinishedPulling="2026-02-20 16:52:23.331037779 +0000 UTC m=+1251.111083187" observedRunningTime="2026-02-20 16:52:24.414405819 +0000 UTC m=+1252.194451237" watchObservedRunningTime="2026-02-20 16:52:24.428117706 +0000 UTC m=+1252.208163104" Feb 20 16:52:25 crc kubenswrapper[4697]: I0220 16:52:25.141944 4697 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-85d8b4ccc6-tdklm" Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.064769 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.133709 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.136785 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58ccbc4c65-26lj7"] Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.137016 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" podUID="388a02e2-0a16-4e6b-9c80-f72cad62a370" containerName="dnsmasq-dns" containerID="cri-o://55dcde7167e76b063abb1cba16c1428e8a382dcdc027781fbc3cb5bd5b5fc9d4" gracePeriod=10 Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.228465 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.386063 4697 generic.go:334] "Generic (PLEG): container finished" podID="388a02e2-0a16-4e6b-9c80-f72cad62a370" containerID="55dcde7167e76b063abb1cba16c1428e8a382dcdc027781fbc3cb5bd5b5fc9d4" exitCode=0 Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.386102 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" event={"ID":"388a02e2-0a16-4e6b-9c80-f72cad62a370","Type":"ContainerDied","Data":"55dcde7167e76b063abb1cba16c1428e8a382dcdc027781fbc3cb5bd5b5fc9d4"} Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.386293 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7a8af28c-1b6b-4137-9d89-439c6f66e980" containerName="cinder-scheduler" 
containerID="cri-o://cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b" gracePeriod=30 Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.386422 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7a8af28c-1b6b-4137-9d89-439c6f66e980" containerName="probe" containerID="cri-o://9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f" gracePeriod=30 Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.774795 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.927950 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-nb\") pod \"388a02e2-0a16-4e6b-9c80-f72cad62a370\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.928116 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fmsh\" (UniqueName: \"kubernetes.io/projected/388a02e2-0a16-4e6b-9c80-f72cad62a370-kube-api-access-6fmsh\") pod \"388a02e2-0a16-4e6b-9c80-f72cad62a370\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.928151 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-config\") pod \"388a02e2-0a16-4e6b-9c80-f72cad62a370\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.928193 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-swift-storage-0\") pod 
\"388a02e2-0a16-4e6b-9c80-f72cad62a370\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.928246 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-svc\") pod \"388a02e2-0a16-4e6b-9c80-f72cad62a370\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.928269 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-sb\") pod \"388a02e2-0a16-4e6b-9c80-f72cad62a370\" (UID: \"388a02e2-0a16-4e6b-9c80-f72cad62a370\") " Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.937768 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388a02e2-0a16-4e6b-9c80-f72cad62a370-kube-api-access-6fmsh" (OuterVolumeSpecName: "kube-api-access-6fmsh") pod "388a02e2-0a16-4e6b-9c80-f72cad62a370" (UID: "388a02e2-0a16-4e6b-9c80-f72cad62a370"). InnerVolumeSpecName "kube-api-access-6fmsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:26 crc kubenswrapper[4697]: I0220 16:52:26.993267 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "388a02e2-0a16-4e6b-9c80-f72cad62a370" (UID: "388a02e2-0a16-4e6b-9c80-f72cad62a370"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.018210 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "388a02e2-0a16-4e6b-9c80-f72cad62a370" (UID: "388a02e2-0a16-4e6b-9c80-f72cad62a370"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.030059 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.030098 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.030111 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fmsh\" (UniqueName: \"kubernetes.io/projected/388a02e2-0a16-4e6b-9c80-f72cad62a370-kube-api-access-6fmsh\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.030098 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "388a02e2-0a16-4e6b-9c80-f72cad62a370" (UID: "388a02e2-0a16-4e6b-9c80-f72cad62a370"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.031405 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "388a02e2-0a16-4e6b-9c80-f72cad62a370" (UID: "388a02e2-0a16-4e6b-9c80-f72cad62a370"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.045997 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-config" (OuterVolumeSpecName: "config") pod "388a02e2-0a16-4e6b-9c80-f72cad62a370" (UID: "388a02e2-0a16-4e6b-9c80-f72cad62a370"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.126645 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bf9cb54f8-bfxkg" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.167:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.167:8443: connect: connection refused" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.131331 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.131368 4697 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.131379 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/388a02e2-0a16-4e6b-9c80-f72cad62a370-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.395774 4697 generic.go:334] "Generic (PLEG): container finished" podID="7a8af28c-1b6b-4137-9d89-439c6f66e980" containerID="9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f" exitCode=0 Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.395830 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a8af28c-1b6b-4137-9d89-439c6f66e980","Type":"ContainerDied","Data":"9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f"} Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.397958 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" event={"ID":"388a02e2-0a16-4e6b-9c80-f72cad62a370","Type":"ContainerDied","Data":"933e9552c801db7385d682e96b8c03d0705443c0741dca666c9fe383c1a12c5d"} Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.398017 4697 scope.go:117] "RemoveContainer" containerID="55dcde7167e76b063abb1cba16c1428e8a382dcdc027781fbc3cb5bd5b5fc9d4" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.398129 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58ccbc4c65-26lj7" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.418619 4697 scope.go:117] "RemoveContainer" containerID="7dec8c6b5ce098c88dfb4b604bc2234243247b64c9240daf2f3bd547e070e8a1" Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.438353 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58ccbc4c65-26lj7"] Feb 20 16:52:27 crc kubenswrapper[4697]: I0220 16:52:27.474114 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58ccbc4c65-26lj7"] Feb 20 16:52:28 crc kubenswrapper[4697]: I0220 16:52:28.891251 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388a02e2-0a16-4e6b-9c80-f72cad62a370" path="/var/lib/kubelet/pods/388a02e2-0a16-4e6b-9c80-f72cad62a370/volumes" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.187412 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.667874 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 20 16:52:29 crc kubenswrapper[4697]: E0220 16:52:29.668291 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388a02e2-0a16-4e6b-9c80-f72cad62a370" containerName="init" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.668313 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="388a02e2-0a16-4e6b-9c80-f72cad62a370" containerName="init" Feb 20 16:52:29 crc kubenswrapper[4697]: E0220 16:52:29.668322 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388a02e2-0a16-4e6b-9c80-f72cad62a370" containerName="dnsmasq-dns" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.668332 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="388a02e2-0a16-4e6b-9c80-f72cad62a370" containerName="dnsmasq-dns" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.668599 4697 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="388a02e2-0a16-4e6b-9c80-f72cad62a370" containerName="dnsmasq-dns" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.669380 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.671759 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.673662 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zqjsn" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.675280 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.680726 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.721810 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.721848 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.722589 4697 scope.go:117] "RemoveContainer" containerID="12cfced469d6bd5f7a254e88c6a9df5e4c3628cc075e1a01ed3fd25baff46ef4" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.799713 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config\") pod \"openstackclient\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.800257 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmrhf\" (UniqueName: \"kubernetes.io/projected/46353336-32b2-48dd-b998-667e36a1cf93-kube-api-access-wmrhf\") pod \"openstackclient\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.800304 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config-secret\") pod \"openstackclient\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.800418 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-combined-ca-bundle\") pod \"openstackclient\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.902701 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmrhf\" (UniqueName: \"kubernetes.io/projected/46353336-32b2-48dd-b998-667e36a1cf93-kube-api-access-wmrhf\") pod \"openstackclient\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.902772 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config-secret\") pod \"openstackclient\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.902865 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-combined-ca-bundle\") pod \"openstackclient\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.903006 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config\") pod \"openstackclient\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.903878 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config\") pod \"openstackclient\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.909142 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config-secret\") pod \"openstackclient\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.916759 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-combined-ca-bundle\") pod \"openstackclient\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.931087 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmrhf\" (UniqueName: \"kubernetes.io/projected/46353336-32b2-48dd-b998-667e36a1cf93-kube-api-access-wmrhf\") pod \"openstackclient\" (UID: 
\"46353336-32b2-48dd-b998-667e36a1cf93\") " pod="openstack/openstackclient" Feb 20 16:52:29 crc kubenswrapper[4697]: I0220 16:52:29.989559 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.028080 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.040892 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.063292 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.066032 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.072826 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 16:52:30 crc kubenswrapper[4697]: E0220 16:52:30.109484 4697 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 20 16:52:30 crc kubenswrapper[4697]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_46353336-32b2-48dd-b998-667e36a1cf93_0(7366c827288de1fcf3ef4c64d5c45e9cf6dd8f576951ed584dad9a5b18d2b93d): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7366c827288de1fcf3ef4c64d5c45e9cf6dd8f576951ed584dad9a5b18d2b93d" Netns:"/var/run/netns/909dab07-d49a-4585-ab8b-92cf56dd251f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=7366c827288de1fcf3ef4c64d5c45e9cf6dd8f576951ed584dad9a5b18d2b93d;K8S_POD_UID=46353336-32b2-48dd-b998-667e36a1cf93" Path:"" ERRORED: error 
configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/46353336-32b2-48dd-b998-667e36a1cf93]: expected pod UID "46353336-32b2-48dd-b998-667e36a1cf93" but got "8dae4cc2-1fb9-47ff-af11-854c15a884a3" from Kube API Feb 20 16:52:30 crc kubenswrapper[4697]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 20 16:52:30 crc kubenswrapper[4697]: > Feb 20 16:52:30 crc kubenswrapper[4697]: E0220 16:52:30.109753 4697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 20 16:52:30 crc kubenswrapper[4697]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_46353336-32b2-48dd-b998-667e36a1cf93_0(7366c827288de1fcf3ef4c64d5c45e9cf6dd8f576951ed584dad9a5b18d2b93d): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7366c827288de1fcf3ef4c64d5c45e9cf6dd8f576951ed584dad9a5b18d2b93d" Netns:"/var/run/netns/909dab07-d49a-4585-ab8b-92cf56dd251f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=7366c827288de1fcf3ef4c64d5c45e9cf6dd8f576951ed584dad9a5b18d2b93d;K8S_POD_UID=46353336-32b2-48dd-b998-667e36a1cf93" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/46353336-32b2-48dd-b998-667e36a1cf93]: expected pod UID "46353336-32b2-48dd-b998-667e36a1cf93" but got "8dae4cc2-1fb9-47ff-af11-854c15a884a3" from Kube API Feb 20 16:52:30 crc kubenswrapper[4697]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 20 16:52:30 crc kubenswrapper[4697]: > pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.208637 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8dae4cc2-1fb9-47ff-af11-854c15a884a3-openstack-config-secret\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.208684 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dae4cc2-1fb9-47ff-af11-854c15a884a3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.208727 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8dae4cc2-1fb9-47ff-af11-854c15a884a3-openstack-config\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.209103 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blb5q\" (UniqueName: \"kubernetes.io/projected/8dae4cc2-1fb9-47ff-af11-854c15a884a3-kube-api-access-blb5q\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " 
pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.311376 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blb5q\" (UniqueName: \"kubernetes.io/projected/8dae4cc2-1fb9-47ff-af11-854c15a884a3-kube-api-access-blb5q\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.311775 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8dae4cc2-1fb9-47ff-af11-854c15a884a3-openstack-config-secret\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.311803 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dae4cc2-1fb9-47ff-af11-854c15a884a3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.311839 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8dae4cc2-1fb9-47ff-af11-854c15a884a3-openstack-config\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.312987 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8dae4cc2-1fb9-47ff-af11-854c15a884a3-openstack-config\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.318177 4697 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8dae4cc2-1fb9-47ff-af11-854c15a884a3-openstack-config-secret\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.319912 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dae4cc2-1fb9-47ff-af11-854c15a884a3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.327387 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blb5q\" (UniqueName: \"kubernetes.io/projected/8dae4cc2-1fb9-47ff-af11-854c15a884a3-kube-api-access-blb5q\") pod \"openstackclient\" (UID: \"8dae4cc2-1fb9-47ff-af11-854c15a884a3\") " pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.429070 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.431723 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.432514 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e","Type":"ContainerStarted","Data":"fa06df65039534b15c98e7d14bb3f265f51319d10a2d9eff5ef2e1bbd5bd7513"} Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.436942 4697 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="46353336-32b2-48dd-b998-667e36a1cf93" podUID="8dae4cc2-1fb9-47ff-af11-854c15a884a3" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.439982 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.616804 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config\") pod \"46353336-32b2-48dd-b998-667e36a1cf93\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.617410 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "46353336-32b2-48dd-b998-667e36a1cf93" (UID: "46353336-32b2-48dd-b998-667e36a1cf93"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.619097 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-combined-ca-bundle\") pod \"46353336-32b2-48dd-b998-667e36a1cf93\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.619175 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmrhf\" (UniqueName: \"kubernetes.io/projected/46353336-32b2-48dd-b998-667e36a1cf93-kube-api-access-wmrhf\") pod \"46353336-32b2-48dd-b998-667e36a1cf93\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.619879 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config-secret\") pod \"46353336-32b2-48dd-b998-667e36a1cf93\" (UID: \"46353336-32b2-48dd-b998-667e36a1cf93\") " Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.620837 4697 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.624993 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "46353336-32b2-48dd-b998-667e36a1cf93" (UID: "46353336-32b2-48dd-b998-667e36a1cf93"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.625058 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46353336-32b2-48dd-b998-667e36a1cf93-kube-api-access-wmrhf" (OuterVolumeSpecName: "kube-api-access-wmrhf") pod "46353336-32b2-48dd-b998-667e36a1cf93" (UID: "46353336-32b2-48dd-b998-667e36a1cf93"). InnerVolumeSpecName "kube-api-access-wmrhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.626573 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46353336-32b2-48dd-b998-667e36a1cf93" (UID: "46353336-32b2-48dd-b998-667e36a1cf93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.723268 4697 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.723565 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46353336-32b2-48dd-b998-667e36a1cf93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.723576 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmrhf\" (UniqueName: \"kubernetes.io/projected/46353336-32b2-48dd-b998-667e36a1cf93-kube-api-access-wmrhf\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.908642 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46353336-32b2-48dd-b998-667e36a1cf93" 
path="/var/lib/kubelet/pods/46353336-32b2-48dd-b998-667e36a1cf93/volumes" Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.943345 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 16:52:30 crc kubenswrapper[4697]: I0220 16:52:30.985175 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.039451 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-combined-ca-bundle\") pod \"7a8af28c-1b6b-4137-9d89-439c6f66e980\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.039516 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-scripts\") pod \"7a8af28c-1b6b-4137-9d89-439c6f66e980\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.039538 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data-custom\") pod \"7a8af28c-1b6b-4137-9d89-439c6f66e980\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.039620 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a8af28c-1b6b-4137-9d89-439c6f66e980-etc-machine-id\") pod \"7a8af28c-1b6b-4137-9d89-439c6f66e980\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.039780 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data\") pod \"7a8af28c-1b6b-4137-9d89-439c6f66e980\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.039812 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8vrc\" (UniqueName: \"kubernetes.io/projected/7a8af28c-1b6b-4137-9d89-439c6f66e980-kube-api-access-k8vrc\") pod \"7a8af28c-1b6b-4137-9d89-439c6f66e980\" (UID: \"7a8af28c-1b6b-4137-9d89-439c6f66e980\") " Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.040510 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a8af28c-1b6b-4137-9d89-439c6f66e980-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7a8af28c-1b6b-4137-9d89-439c6f66e980" (UID: "7a8af28c-1b6b-4137-9d89-439c6f66e980"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.041064 4697 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a8af28c-1b6b-4137-9d89-439c6f66e980-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.050077 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-scripts" (OuterVolumeSpecName: "scripts") pod "7a8af28c-1b6b-4137-9d89-439c6f66e980" (UID: "7a8af28c-1b6b-4137-9d89-439c6f66e980"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.051574 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a8af28c-1b6b-4137-9d89-439c6f66e980" (UID: "7a8af28c-1b6b-4137-9d89-439c6f66e980"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.054585 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8af28c-1b6b-4137-9d89-439c6f66e980-kube-api-access-k8vrc" (OuterVolumeSpecName: "kube-api-access-k8vrc") pod "7a8af28c-1b6b-4137-9d89-439c6f66e980" (UID: "7a8af28c-1b6b-4137-9d89-439c6f66e980"). InnerVolumeSpecName "kube-api-access-k8vrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.122167 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a8af28c-1b6b-4137-9d89-439c6f66e980" (UID: "7a8af28c-1b6b-4137-9d89-439c6f66e980"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.143212 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.143246 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.143256 4697 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.143264 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8vrc\" (UniqueName: \"kubernetes.io/projected/7a8af28c-1b6b-4137-9d89-439c6f66e980-kube-api-access-k8vrc\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.166273 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data" (OuterVolumeSpecName: "config-data") pod "7a8af28c-1b6b-4137-9d89-439c6f66e980" (UID: "7a8af28c-1b6b-4137-9d89-439c6f66e980"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.184912 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.184982 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.244964 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8af28c-1b6b-4137-9d89-439c6f66e980-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.449607 4697 generic.go:334] "Generic (PLEG): container finished" podID="7a8af28c-1b6b-4137-9d89-439c6f66e980" containerID="cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b" exitCode=0 Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.449672 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a8af28c-1b6b-4137-9d89-439c6f66e980","Type":"ContainerDied","Data":"cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b"} Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.449699 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a8af28c-1b6b-4137-9d89-439c6f66e980","Type":"ContainerDied","Data":"96b4f4328109839f8a6b4c38788aec43c562377e03e1a734d327f0205dc94cc3"} Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.449695 4697 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.449720 4697 scope.go:117] "RemoveContainer" containerID="9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.452326 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.452356 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8dae4cc2-1fb9-47ff-af11-854c15a884a3","Type":"ContainerStarted","Data":"065f1384f3fb4d86f47e982c0e4993ecfe2a08b9aca60d334d5549f6c071a1c7"} Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.459526 4697 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="46353336-32b2-48dd-b998-667e36a1cf93" podUID="8dae4cc2-1fb9-47ff-af11-854c15a884a3" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.498653 4697 scope.go:117] "RemoveContainer" containerID="cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.499100 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.528324 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.573277 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 16:52:31 crc kubenswrapper[4697]: E0220 16:52:31.574027 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8af28c-1b6b-4137-9d89-439c6f66e980" containerName="cinder-scheduler" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.574050 4697 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7a8af28c-1b6b-4137-9d89-439c6f66e980" containerName="cinder-scheduler" Feb 20 16:52:31 crc kubenswrapper[4697]: E0220 16:52:31.574080 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8af28c-1b6b-4137-9d89-439c6f66e980" containerName="probe" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.574088 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8af28c-1b6b-4137-9d89-439c6f66e980" containerName="probe" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.574328 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8af28c-1b6b-4137-9d89-439c6f66e980" containerName="probe" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.574354 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8af28c-1b6b-4137-9d89-439c6f66e980" containerName="cinder-scheduler" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.575794 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.578131 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.583728 4697 scope.go:117] "RemoveContainer" containerID="9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f" Feb 20 16:52:31 crc kubenswrapper[4697]: E0220 16:52:31.584199 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f\": container with ID starting with 9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f not found: ID does not exist" containerID="9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.584242 4697 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f"} err="failed to get container status \"9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f\": rpc error: code = NotFound desc = could not find container \"9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f\": container with ID starting with 9641736777f60766a9e84516b909ac9faa8834449b43fe365443f9e189d4c27f not found: ID does not exist" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.584271 4697 scope.go:117] "RemoveContainer" containerID="cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b" Feb 20 16:52:31 crc kubenswrapper[4697]: E0220 16:52:31.584613 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b\": container with ID starting with cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b not found: ID does not exist" containerID="cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.584646 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b"} err="failed to get container status \"cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b\": rpc error: code = NotFound desc = could not find container \"cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b\": container with ID starting with cddc2956aefccdcc89875d653d9cbe6bfb47f1e2d8c63d029906855771d0154b not found: ID does not exist" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.587645 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.753535 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-config-data\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.753604 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.753647 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.753702 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-scripts\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.753753 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxps\" (UniqueName: \"kubernetes.io/projected/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-kube-api-access-8kxps\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.753781 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.855669 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-scripts\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.855760 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxps\" (UniqueName: \"kubernetes.io/projected/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-kube-api-access-8kxps\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.855793 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.856221 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-config-data\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.856601 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " 
pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.856639 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.857107 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.864315 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.864669 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-config-data\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.864931 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.874875 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-scripts\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.895828 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxps\" (UniqueName: \"kubernetes.io/projected/8a927bbf-7945-44bd-9c0c-1f24b0af5b9a-kube-api-access-8kxps\") pod \"cinder-scheduler-0\" (UID: \"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a\") " pod="openstack/cinder-scheduler-0" Feb 20 16:52:31 crc kubenswrapper[4697]: I0220 16:52:31.902144 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 16:52:32 crc kubenswrapper[4697]: I0220 16:52:32.394390 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 16:52:32 crc kubenswrapper[4697]: W0220 16:52:32.402425 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a927bbf_7945_44bd_9c0c_1f24b0af5b9a.slice/crio-7a56637a72e2f74b965fd850c8d7eac80c8cfb4a3ebb148d130695cf39ac1330 WatchSource:0}: Error finding container 7a56637a72e2f74b965fd850c8d7eac80c8cfb4a3ebb148d130695cf39ac1330: Status 404 returned error can't find the container with id 7a56637a72e2f74b965fd850c8d7eac80c8cfb4a3ebb148d130695cf39ac1330 Feb 20 16:52:32 crc kubenswrapper[4697]: I0220 16:52:32.462842 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a","Type":"ContainerStarted","Data":"7a56637a72e2f74b965fd850c8d7eac80c8cfb4a3ebb148d130695cf39ac1330"} Feb 20 16:52:32 crc kubenswrapper[4697]: I0220 16:52:32.945194 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8af28c-1b6b-4137-9d89-439c6f66e980" 
path="/var/lib/kubelet/pods/7a8af28c-1b6b-4137-9d89-439c6f66e980/volumes" Feb 20 16:52:33 crc kubenswrapper[4697]: I0220 16:52:33.475825 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a","Type":"ContainerStarted","Data":"e1c397d1881263fcd42e1dff07de0cbe871aea1aaf767ac4ff8e08bdab461837"} Feb 20 16:52:33 crc kubenswrapper[4697]: I0220 16:52:33.863078 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:33 crc kubenswrapper[4697]: I0220 16:52:33.863363 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="ceilometer-central-agent" containerID="cri-o://f2a294635ef7a9d048fae0cbad91d12928dc12ae556ca28dc4aec4a4124e5e4f" gracePeriod=30 Feb 20 16:52:33 crc kubenswrapper[4697]: I0220 16:52:33.863807 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="ceilometer-notification-agent" containerID="cri-o://c348ac1ab4df35692172bcd20cd3859edb1839a4ecae2755cad96f2f6e751521" gracePeriod=30 Feb 20 16:52:33 crc kubenswrapper[4697]: I0220 16:52:33.863817 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="proxy-httpd" containerID="cri-o://b3fe80799d3838360c1db67ef5874ba4b2074bf0aff1730e8ff615eed9e39ef4" gracePeriod=30 Feb 20 16:52:33 crc kubenswrapper[4697]: I0220 16:52:33.863931 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="sg-core" containerID="cri-o://3832968f98762b56f14b83425d542dda9fb74b2de3c2936c0a022909e7bb466b" gracePeriod=30 Feb 20 16:52:33 crc kubenswrapper[4697]: I0220 16:52:33.982537 4697 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.190:3000/\": read tcp 10.217.0.2:52020->10.217.0.190:3000: read: connection reset by peer" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.487333 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8a927bbf-7945-44bd-9c0c-1f24b0af5b9a","Type":"ContainerStarted","Data":"1d233688128ab5fda6e477cc0a22b1a319958b49b0006dd6365355e1ed70d66e"} Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.490635 4697 generic.go:334] "Generic (PLEG): container finished" podID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerID="fa06df65039534b15c98e7d14bb3f265f51319d10a2d9eff5ef2e1bbd5bd7513" exitCode=1 Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.490730 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e","Type":"ContainerDied","Data":"fa06df65039534b15c98e7d14bb3f265f51319d10a2d9eff5ef2e1bbd5bd7513"} Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.490768 4697 scope.go:117] "RemoveContainer" containerID="12cfced469d6bd5f7a254e88c6a9df5e4c3628cc075e1a01ed3fd25baff46ef4" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.491556 4697 scope.go:117] "RemoveContainer" containerID="fa06df65039534b15c98e7d14bb3f265f51319d10a2d9eff5ef2e1bbd5bd7513" Feb 20 16:52:34 crc kubenswrapper[4697]: E0220 16:52:34.491789 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c9c1a3da-b8c3-4825-ba1e-c6bbecff953e)\"" pod="openstack/watcher-decision-engine-0" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" Feb 20 16:52:34 crc 
kubenswrapper[4697]: I0220 16:52:34.510138 4697 generic.go:334] "Generic (PLEG): container finished" podID="671e5d87-8407-4686-93f7-1786cc51ab43" containerID="b3fe80799d3838360c1db67ef5874ba4b2074bf0aff1730e8ff615eed9e39ef4" exitCode=0 Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.510173 4697 generic.go:334] "Generic (PLEG): container finished" podID="671e5d87-8407-4686-93f7-1786cc51ab43" containerID="3832968f98762b56f14b83425d542dda9fb74b2de3c2936c0a022909e7bb466b" exitCode=2 Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.510180 4697 generic.go:334] "Generic (PLEG): container finished" podID="671e5d87-8407-4686-93f7-1786cc51ab43" containerID="f2a294635ef7a9d048fae0cbad91d12928dc12ae556ca28dc4aec4a4124e5e4f" exitCode=0 Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.510210 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"671e5d87-8407-4686-93f7-1786cc51ab43","Type":"ContainerDied","Data":"b3fe80799d3838360c1db67ef5874ba4b2074bf0aff1730e8ff615eed9e39ef4"} Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.510257 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"671e5d87-8407-4686-93f7-1786cc51ab43","Type":"ContainerDied","Data":"3832968f98762b56f14b83425d542dda9fb74b2de3c2936c0a022909e7bb466b"} Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.510269 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"671e5d87-8407-4686-93f7-1786cc51ab43","Type":"ContainerDied","Data":"f2a294635ef7a9d048fae0cbad91d12928dc12ae556ca28dc4aec4a4124e5e4f"} Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.526896 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.526877618 podStartE2EDuration="3.526877618s" podCreationTimestamp="2026-02-20 16:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:34.513676444 +0000 UTC m=+1262.293721852" watchObservedRunningTime="2026-02-20 16:52:34.526877618 +0000 UTC m=+1262.306923026" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.633401 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7d485f4f89-v5ctf"] Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.635082 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.638584 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.641098 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.641260 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.648333 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7d485f4f89-v5ctf"] Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.739764 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/536e289e-762f-4f9f-8b58-027b09cf2609-log-httpd\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.739831 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/536e289e-762f-4f9f-8b58-027b09cf2609-etc-swift\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " 
pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.740014 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-config-data\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.740145 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-internal-tls-certs\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.740213 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-public-tls-certs\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.740263 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhz99\" (UniqueName: \"kubernetes.io/projected/536e289e-762f-4f9f-8b58-027b09cf2609-kube-api-access-jhz99\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.740344 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/536e289e-762f-4f9f-8b58-027b09cf2609-run-httpd\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: 
\"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.740543 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-combined-ca-bundle\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.842078 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/536e289e-762f-4f9f-8b58-027b09cf2609-log-httpd\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.842186 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/536e289e-762f-4f9f-8b58-027b09cf2609-etc-swift\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.842242 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-config-data\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.842286 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-internal-tls-certs\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " 
pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.842318 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-public-tls-certs\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.842349 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhz99\" (UniqueName: \"kubernetes.io/projected/536e289e-762f-4f9f-8b58-027b09cf2609-kube-api-access-jhz99\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.842393 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/536e289e-762f-4f9f-8b58-027b09cf2609-run-httpd\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.842484 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-combined-ca-bundle\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.843590 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/536e289e-762f-4f9f-8b58-027b09cf2609-run-httpd\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 
crc kubenswrapper[4697]: I0220 16:52:34.848846 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/536e289e-762f-4f9f-8b58-027b09cf2609-log-httpd\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.850965 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-config-data\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.855763 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/536e289e-762f-4f9f-8b58-027b09cf2609-etc-swift\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.856248 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-combined-ca-bundle\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.860236 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-internal-tls-certs\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.862206 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jhz99\" (UniqueName: \"kubernetes.io/projected/536e289e-762f-4f9f-8b58-027b09cf2609-kube-api-access-jhz99\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.863176 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/536e289e-762f-4f9f-8b58-027b09cf2609-public-tls-certs\") pod \"swift-proxy-7d485f4f89-v5ctf\" (UID: \"536e289e-762f-4f9f-8b58-027b09cf2609\") " pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:34 crc kubenswrapper[4697]: I0220 16:52:34.959171 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:35 crc kubenswrapper[4697]: I0220 16:52:35.506716 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7d485f4f89-v5ctf"] Feb 20 16:52:35 crc kubenswrapper[4697]: W0220 16:52:35.544006 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod536e289e_762f_4f9f_8b58_027b09cf2609.slice/crio-efff01d5e3ec31542691f5415f384543ce0cde6b1e7c5392ac0d616281f7eaa8 WatchSource:0}: Error finding container efff01d5e3ec31542691f5415f384543ce0cde6b1e7c5392ac0d616281f7eaa8: Status 404 returned error can't find the container with id efff01d5e3ec31542691f5415f384543ce0cde6b1e7c5392ac0d616281f7eaa8 Feb 20 16:52:36 crc kubenswrapper[4697]: I0220 16:52:36.555357 4697 generic.go:334] "Generic (PLEG): container finished" podID="671e5d87-8407-4686-93f7-1786cc51ab43" containerID="c348ac1ab4df35692172bcd20cd3859edb1839a4ecae2755cad96f2f6e751521" exitCode=0 Feb 20 16:52:36 crc kubenswrapper[4697]: I0220 16:52:36.555448 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"671e5d87-8407-4686-93f7-1786cc51ab43","Type":"ContainerDied","Data":"c348ac1ab4df35692172bcd20cd3859edb1839a4ecae2755cad96f2f6e751521"} Feb 20 16:52:36 crc kubenswrapper[4697]: I0220 16:52:36.558779 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d485f4f89-v5ctf" event={"ID":"536e289e-762f-4f9f-8b58-027b09cf2609","Type":"ContainerStarted","Data":"5e233024b63418b4e1e2867e50fe2bb52bc75080d2ddd4d3c0d8e812b51a0bee"} Feb 20 16:52:36 crc kubenswrapper[4697]: I0220 16:52:36.558821 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d485f4f89-v5ctf" event={"ID":"536e289e-762f-4f9f-8b58-027b09cf2609","Type":"ContainerStarted","Data":"ed017e4fb4c4f8431391d300f74af58af12d80f2f20b863a61f899f9d1d0b0e4"} Feb 20 16:52:36 crc kubenswrapper[4697]: I0220 16:52:36.558835 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d485f4f89-v5ctf" event={"ID":"536e289e-762f-4f9f-8b58-027b09cf2609","Type":"ContainerStarted","Data":"efff01d5e3ec31542691f5415f384543ce0cde6b1e7c5392ac0d616281f7eaa8"} Feb 20 16:52:36 crc kubenswrapper[4697]: I0220 16:52:36.558953 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:36 crc kubenswrapper[4697]: I0220 16:52:36.602878 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7d485f4f89-v5ctf" podStartSLOduration=2.602861663 podStartE2EDuration="2.602861663s" podCreationTimestamp="2026-02-20 16:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:36.598875132 +0000 UTC m=+1264.378920560" watchObservedRunningTime="2026-02-20 16:52:36.602861663 +0000 UTC m=+1264.382907071" Feb 20 16:52:36 crc kubenswrapper[4697]: I0220 16:52:36.903031 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-scheduler-0" Feb 20 16:52:37 crc kubenswrapper[4697]: I0220 16:52:37.126846 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bf9cb54f8-bfxkg" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.167:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.167:8443: connect: connection refused" Feb 20 16:52:37 crc kubenswrapper[4697]: I0220 16:52:37.126959 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:52:37 crc kubenswrapper[4697]: I0220 16:52:37.456057 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-76b77c89fc-t9rjg" Feb 20 16:52:37 crc kubenswrapper[4697]: I0220 16:52:37.529143 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f4865d6c6-nk2sf"] Feb 20 16:52:37 crc kubenswrapper[4697]: I0220 16:52:37.529595 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f4865d6c6-nk2sf" podUID="101ffeb3-55ef-4fbc-a42f-6532e6ff220e" containerName="neutron-api" containerID="cri-o://942e3fd4d8e91fd32f78f7db69871651847f38705f7bbfaa09ba841bce8c8ce4" gracePeriod=30 Feb 20 16:52:37 crc kubenswrapper[4697]: I0220 16:52:37.530034 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f4865d6c6-nk2sf" podUID="101ffeb3-55ef-4fbc-a42f-6532e6ff220e" containerName="neutron-httpd" containerID="cri-o://aebb766dec87808ba3cfa76453cedb421eb8d23cfe826110da5123e01b0ab6e1" gracePeriod=30 Feb 20 16:52:37 crc kubenswrapper[4697]: I0220 16:52:37.577531 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:38 crc kubenswrapper[4697]: I0220 16:52:38.589533 4697 generic.go:334] "Generic (PLEG): container finished" podID="101ffeb3-55ef-4fbc-a42f-6532e6ff220e" 
containerID="aebb766dec87808ba3cfa76453cedb421eb8d23cfe826110da5123e01b0ab6e1" exitCode=0 Feb 20 16:52:38 crc kubenswrapper[4697]: I0220 16:52:38.591177 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f4865d6c6-nk2sf" event={"ID":"101ffeb3-55ef-4fbc-a42f-6532e6ff220e","Type":"ContainerDied","Data":"aebb766dec87808ba3cfa76453cedb421eb8d23cfe826110da5123e01b0ab6e1"} Feb 20 16:52:39 crc kubenswrapper[4697]: I0220 16:52:39.721865 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 20 16:52:39 crc kubenswrapper[4697]: I0220 16:52:39.722256 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 20 16:52:39 crc kubenswrapper[4697]: I0220 16:52:39.725733 4697 scope.go:117] "RemoveContainer" containerID="fa06df65039534b15c98e7d14bb3f265f51319d10a2d9eff5ef2e1bbd5bd7513" Feb 20 16:52:39 crc kubenswrapper[4697]: E0220 16:52:39.726333 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c9c1a3da-b8c3-4825-ba1e-c6bbecff953e)\"" pod="openstack/watcher-decision-engine-0" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" Feb 20 16:52:40 crc kubenswrapper[4697]: I0220 16:52:40.622899 4697 generic.go:334] "Generic (PLEG): container finished" podID="101ffeb3-55ef-4fbc-a42f-6532e6ff220e" containerID="942e3fd4d8e91fd32f78f7db69871651847f38705f7bbfaa09ba841bce8c8ce4" exitCode=0 Feb 20 16:52:40 crc kubenswrapper[4697]: I0220 16:52:40.623038 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f4865d6c6-nk2sf" event={"ID":"101ffeb3-55ef-4fbc-a42f-6532e6ff220e","Type":"ContainerDied","Data":"942e3fd4d8e91fd32f78f7db69871651847f38705f7bbfaa09ba841bce8c8ce4"} Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 
16:52:42.127889 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.310339 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.454198 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f4865d6c6-nk2sf" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.455793 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwnkf\" (UniqueName: \"kubernetes.io/projected/671e5d87-8407-4686-93f7-1786cc51ab43-kube-api-access-nwnkf\") pod \"671e5d87-8407-4686-93f7-1786cc51ab43\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.455856 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-combined-ca-bundle\") pod \"671e5d87-8407-4686-93f7-1786cc51ab43\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.455919 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-run-httpd\") pod \"671e5d87-8407-4686-93f7-1786cc51ab43\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.456064 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-config-data\") pod \"671e5d87-8407-4686-93f7-1786cc51ab43\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.456166 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-log-httpd\") pod \"671e5d87-8407-4686-93f7-1786cc51ab43\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.456188 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-sg-core-conf-yaml\") pod \"671e5d87-8407-4686-93f7-1786cc51ab43\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.456215 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-scripts\") pod \"671e5d87-8407-4686-93f7-1786cc51ab43\" (UID: \"671e5d87-8407-4686-93f7-1786cc51ab43\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.457168 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "671e5d87-8407-4686-93f7-1786cc51ab43" (UID: "671e5d87-8407-4686-93f7-1786cc51ab43"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.458084 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "671e5d87-8407-4686-93f7-1786cc51ab43" (UID: "671e5d87-8407-4686-93f7-1786cc51ab43"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.463715 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671e5d87-8407-4686-93f7-1786cc51ab43-kube-api-access-nwnkf" (OuterVolumeSpecName: "kube-api-access-nwnkf") pod "671e5d87-8407-4686-93f7-1786cc51ab43" (UID: "671e5d87-8407-4686-93f7-1786cc51ab43"). InnerVolumeSpecName "kube-api-access-nwnkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.463828 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-scripts" (OuterVolumeSpecName: "scripts") pod "671e5d87-8407-4686-93f7-1786cc51ab43" (UID: "671e5d87-8407-4686-93f7-1786cc51ab43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.519891 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "671e5d87-8407-4686-93f7-1786cc51ab43" (UID: "671e5d87-8407-4686-93f7-1786cc51ab43"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.562231 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-ovndb-tls-certs\") pod \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.562300 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-combined-ca-bundle\") pod \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.562330 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-httpd-config\") pod \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.562403 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-config\") pod \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.562445 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2ncf\" (UniqueName: \"kubernetes.io/projected/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-kube-api-access-j2ncf\") pod \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\" (UID: \"101ffeb3-55ef-4fbc-a42f-6532e6ff220e\") " Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.562851 4697 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.562862 4697 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.562872 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.562880 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwnkf\" (UniqueName: \"kubernetes.io/projected/671e5d87-8407-4686-93f7-1786cc51ab43-kube-api-access-nwnkf\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.562889 4697 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/671e5d87-8407-4686-93f7-1786cc51ab43-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.570733 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-kube-api-access-j2ncf" (OuterVolumeSpecName: "kube-api-access-j2ncf") pod "101ffeb3-55ef-4fbc-a42f-6532e6ff220e" (UID: "101ffeb3-55ef-4fbc-a42f-6532e6ff220e"). InnerVolumeSpecName "kube-api-access-j2ncf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.591558 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "101ffeb3-55ef-4fbc-a42f-6532e6ff220e" (UID: "101ffeb3-55ef-4fbc-a42f-6532e6ff220e"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.644618 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "101ffeb3-55ef-4fbc-a42f-6532e6ff220e" (UID: "101ffeb3-55ef-4fbc-a42f-6532e6ff220e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.651956 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"671e5d87-8407-4686-93f7-1786cc51ab43","Type":"ContainerDied","Data":"b4650d0d9a7354ea535ee578eb0f168ae59902ef38278f1716c5bfca7170494b"} Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.652056 4697 scope.go:117] "RemoveContainer" containerID="b3fe80799d3838360c1db67ef5874ba4b2074bf0aff1730e8ff615eed9e39ef4" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.652253 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.653609 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "671e5d87-8407-4686-93f7-1786cc51ab43" (UID: "671e5d87-8407-4686-93f7-1786cc51ab43"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.665189 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.665219 4697 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.665227 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2ncf\" (UniqueName: \"kubernetes.io/projected/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-kube-api-access-j2ncf\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.665239 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.666071 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f4865d6c6-nk2sf" event={"ID":"101ffeb3-55ef-4fbc-a42f-6532e6ff220e","Type":"ContainerDied","Data":"69904eb3eb07634d33593dc165c5dd5a0fe5aa3d1f169390135bbe9ec9bfe8f5"} Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.666158 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f4865d6c6-nk2sf" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.688794 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8dae4cc2-1fb9-47ff-af11-854c15a884a3","Type":"ContainerStarted","Data":"66c01fe1afcd180c513639dd5a6301b0ece0c8ea3a74ea2826e55d2f22f41424"} Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.694597 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-config-data" (OuterVolumeSpecName: "config-data") pod "671e5d87-8407-4686-93f7-1786cc51ab43" (UID: "671e5d87-8407-4686-93f7-1786cc51ab43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.695825 4697 scope.go:117] "RemoveContainer" containerID="3832968f98762b56f14b83425d542dda9fb74b2de3c2936c0a022909e7bb466b" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.756579 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.7625031519999999 podStartE2EDuration="12.756555518s" podCreationTimestamp="2026-02-20 16:52:30 +0000 UTC" firstStartedPulling="2026-02-20 16:52:30.914712591 +0000 UTC m=+1258.694757999" lastFinishedPulling="2026-02-20 16:52:41.908764957 +0000 UTC m=+1269.688810365" observedRunningTime="2026-02-20 16:52:42.745233382 +0000 UTC m=+1270.525278810" watchObservedRunningTime="2026-02-20 16:52:42.756555518 +0000 UTC m=+1270.536600926" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.762867 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-config" (OuterVolumeSpecName: "config") pod "101ffeb3-55ef-4fbc-a42f-6532e6ff220e" (UID: "101ffeb3-55ef-4fbc-a42f-6532e6ff220e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.783461 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671e5d87-8407-4686-93f7-1786cc51ab43-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.783496 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.785848 4697 scope.go:117] "RemoveContainer" containerID="c348ac1ab4df35692172bcd20cd3859edb1839a4ecae2755cad96f2f6e751521" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.827925 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "101ffeb3-55ef-4fbc-a42f-6532e6ff220e" (UID: "101ffeb3-55ef-4fbc-a42f-6532e6ff220e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.885167 4697 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/101ffeb3-55ef-4fbc-a42f-6532e6ff220e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.924708 4697 scope.go:117] "RemoveContainer" containerID="f2a294635ef7a9d048fae0cbad91d12928dc12ae556ca28dc4aec4a4124e5e4f" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.947298 4697 scope.go:117] "RemoveContainer" containerID="aebb766dec87808ba3cfa76453cedb421eb8d23cfe826110da5123e01b0ab6e1" Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.979230 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:42 crc kubenswrapper[4697]: I0220 16:52:42.989935 4697 scope.go:117] "RemoveContainer" containerID="942e3fd4d8e91fd32f78f7db69871651847f38705f7bbfaa09ba841bce8c8ce4" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.000217 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.010711 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f4865d6c6-nk2sf"] Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.020774 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:43 crc kubenswrapper[4697]: E0220 16:52:43.021130 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="ceilometer-notification-agent" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.021147 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="ceilometer-notification-agent" Feb 20 16:52:43 crc kubenswrapper[4697]: E0220 16:52:43.021163 4697 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="101ffeb3-55ef-4fbc-a42f-6532e6ff220e" containerName="neutron-api" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.021169 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="101ffeb3-55ef-4fbc-a42f-6532e6ff220e" containerName="neutron-api" Feb 20 16:52:43 crc kubenswrapper[4697]: E0220 16:52:43.021186 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="proxy-httpd" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.021193 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="proxy-httpd" Feb 20 16:52:43 crc kubenswrapper[4697]: E0220 16:52:43.021202 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="ceilometer-central-agent" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.021208 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="ceilometer-central-agent" Feb 20 16:52:43 crc kubenswrapper[4697]: E0220 16:52:43.021222 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="sg-core" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.021228 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="sg-core" Feb 20 16:52:43 crc kubenswrapper[4697]: E0220 16:52:43.021246 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="101ffeb3-55ef-4fbc-a42f-6532e6ff220e" containerName="neutron-httpd" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.021253 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="101ffeb3-55ef-4fbc-a42f-6532e6ff220e" containerName="neutron-httpd" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.023517 4697 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="ceilometer-notification-agent" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.023546 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="101ffeb3-55ef-4fbc-a42f-6532e6ff220e" containerName="neutron-httpd" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.023557 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="sg-core" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.023571 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="proxy-httpd" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.023580 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" containerName="ceilometer-central-agent" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.023590 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="101ffeb3-55ef-4fbc-a42f-6532e6ff220e" containerName="neutron-api" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.028217 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.039900 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.040122 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.050305 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f4865d6c6-nk2sf"] Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.061755 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.189584 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-run-httpd\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.189878 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-config-data\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.189941 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.189960 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6k7b\" 
(UniqueName: \"kubernetes.io/projected/3cd1e114-c1b8-4c57-a890-abd5708ce27f-kube-api-access-v6k7b\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.190168 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.190262 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-log-httpd\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.190488 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-scripts\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.292665 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-run-httpd\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.292709 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-config-data\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " 
pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.292857 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.292892 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6k7b\" (UniqueName: \"kubernetes.io/projected/3cd1e114-c1b8-4c57-a890-abd5708ce27f-kube-api-access-v6k7b\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.292925 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.292968 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-log-httpd\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.293027 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-scripts\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.294657 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-run-httpd\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.294959 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-log-httpd\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.296930 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.297821 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.299785 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-scripts\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.304588 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-config-data\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.312911 4697 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v6k7b\" (UniqueName: \"kubernetes.io/projected/3cd1e114-c1b8-4c57-a890-abd5708ce27f-kube-api-access-v6k7b\") pod \"ceilometer-0\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.354117 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.559042 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.559785 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" containerName="glance-log" containerID="cri-o://5da033a97a3e25310bf04bbbf86d1464185a251c43b5c741b61fd0b34e5d176b" gracePeriod=30 Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.560488 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" containerName="glance-httpd" containerID="cri-o://3ac5779a2c08cfdf37177a0b0971baaa2d0dd5bc7ccf8aabcba210efda1da730" gracePeriod=30 Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.716052 4697 generic.go:334] "Generic (PLEG): container finished" podID="fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" containerID="5da033a97a3e25310bf04bbbf86d1464185a251c43b5c741b61fd0b34e5d176b" exitCode=143 Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.716118 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c","Type":"ContainerDied","Data":"5da033a97a3e25310bf04bbbf86d1464185a251c43b5c741b61fd0b34e5d176b"} Feb 20 16:52:43 crc kubenswrapper[4697]: I0220 16:52:43.855589 4697 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.286411 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.322799 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-config-data\") pod \"d6667b0d-626d-4578-9767-2026d21b1583\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.322858 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-scripts\") pod \"d6667b0d-626d-4578-9767-2026d21b1583\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.322927 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-secret-key\") pod \"d6667b0d-626d-4578-9767-2026d21b1583\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.322946 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-combined-ca-bundle\") pod \"d6667b0d-626d-4578-9767-2026d21b1583\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.368481 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-scripts" (OuterVolumeSpecName: "scripts") pod "d6667b0d-626d-4578-9767-2026d21b1583" (UID: "d6667b0d-626d-4578-9767-2026d21b1583"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.369914 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d6667b0d-626d-4578-9767-2026d21b1583" (UID: "d6667b0d-626d-4578-9767-2026d21b1583"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.373164 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-config-data" (OuterVolumeSpecName: "config-data") pod "d6667b0d-626d-4578-9767-2026d21b1583" (UID: "d6667b0d-626d-4578-9767-2026d21b1583"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.373838 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6667b0d-626d-4578-9767-2026d21b1583" (UID: "d6667b0d-626d-4578-9767-2026d21b1583"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.424869 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2chfk\" (UniqueName: \"kubernetes.io/projected/d6667b0d-626d-4578-9767-2026d21b1583-kube-api-access-2chfk\") pod \"d6667b0d-626d-4578-9767-2026d21b1583\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.424943 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-tls-certs\") pod \"d6667b0d-626d-4578-9767-2026d21b1583\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.425000 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6667b0d-626d-4578-9767-2026d21b1583-logs\") pod \"d6667b0d-626d-4578-9767-2026d21b1583\" (UID: \"d6667b0d-626d-4578-9767-2026d21b1583\") " Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.425615 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.425632 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6667b0d-626d-4578-9767-2026d21b1583-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.425645 4697 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.425656 4697 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.426108 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6667b0d-626d-4578-9767-2026d21b1583-logs" (OuterVolumeSpecName: "logs") pod "d6667b0d-626d-4578-9767-2026d21b1583" (UID: "d6667b0d-626d-4578-9767-2026d21b1583"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.434256 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6667b0d-626d-4578-9767-2026d21b1583-kube-api-access-2chfk" (OuterVolumeSpecName: "kube-api-access-2chfk") pod "d6667b0d-626d-4578-9767-2026d21b1583" (UID: "d6667b0d-626d-4578-9767-2026d21b1583"). InnerVolumeSpecName "kube-api-access-2chfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.485589 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "d6667b0d-626d-4578-9767-2026d21b1583" (UID: "d6667b0d-626d-4578-9767-2026d21b1583"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.528503 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2chfk\" (UniqueName: \"kubernetes.io/projected/d6667b0d-626d-4578-9767-2026d21b1583-kube-api-access-2chfk\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.528565 4697 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6667b0d-626d-4578-9767-2026d21b1583-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.528577 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6667b0d-626d-4578-9767-2026d21b1583-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.734561 4697 generic.go:334] "Generic (PLEG): container finished" podID="d6667b0d-626d-4578-9767-2026d21b1583" containerID="6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d" exitCode=137 Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.734730 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bf9cb54f8-bfxkg" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.740765 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf9cb54f8-bfxkg" event={"ID":"d6667b0d-626d-4578-9767-2026d21b1583","Type":"ContainerDied","Data":"6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d"} Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.740849 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf9cb54f8-bfxkg" event={"ID":"d6667b0d-626d-4578-9767-2026d21b1583","Type":"ContainerDied","Data":"bb23523d88ffd28cdc578147a74321a5dda15c4bb2063b7208d669888250ee20"} Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.740875 4697 scope.go:117] "RemoveContainer" containerID="29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.745770 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cd1e114-c1b8-4c57-a890-abd5708ce27f","Type":"ContainerStarted","Data":"a09dd56597828e773d401594c72aa03914026cc5879c843c41a59bfa76e5b55d"} Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.745830 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cd1e114-c1b8-4c57-a890-abd5708ce27f","Type":"ContainerStarted","Data":"cc4d4ee346506e8d7c5d7bd173bdccf98fc8714174809e7252e1bd87d0d0d276"} Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.745849 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cd1e114-c1b8-4c57-a890-abd5708ce27f","Type":"ContainerStarted","Data":"54e9e8228aa4e264ca35670fafa68747c0a540deabba26ba0c910975c69b6c36"} Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.775930 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bf9cb54f8-bfxkg"] Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.815199 4697 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7bf9cb54f8-bfxkg"] Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.890032 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="101ffeb3-55ef-4fbc-a42f-6532e6ff220e" path="/var/lib/kubelet/pods/101ffeb3-55ef-4fbc-a42f-6532e6ff220e/volumes" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.890824 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671e5d87-8407-4686-93f7-1786cc51ab43" path="/var/lib/kubelet/pods/671e5d87-8407-4686-93f7-1786cc51ab43/volumes" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.892332 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6667b0d-626d-4578-9767-2026d21b1583" path="/var/lib/kubelet/pods/d6667b0d-626d-4578-9767-2026d21b1583/volumes" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.929223 4697 scope.go:117] "RemoveContainer" containerID="6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.957988 4697 scope.go:117] "RemoveContainer" containerID="29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7" Feb 20 16:52:44 crc kubenswrapper[4697]: E0220 16:52:44.958660 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7\": container with ID starting with 29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7 not found: ID does not exist" containerID="29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.958694 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7"} err="failed to get container status 
\"29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7\": rpc error: code = NotFound desc = could not find container \"29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7\": container with ID starting with 29dfbee25b7f6733ca1c0295d89b17248a9f994009b01b0b3cbb418af6af0ed7 not found: ID does not exist" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.958720 4697 scope.go:117] "RemoveContainer" containerID="6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d" Feb 20 16:52:44 crc kubenswrapper[4697]: E0220 16:52:44.959490 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d\": container with ID starting with 6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d not found: ID does not exist" containerID="6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.959510 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d"} err="failed to get container status \"6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d\": rpc error: code = NotFound desc = could not find container \"6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d\": container with ID starting with 6d0410185532990a6b4caeb37d011a19d63b822fc60306a5183b34365f367f7d not found: ID does not exist" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.974476 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:44 crc kubenswrapper[4697]: I0220 16:52:44.975243 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7d485f4f89-v5ctf" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.180399 4697 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7pd9r"] Feb 20 16:52:45 crc kubenswrapper[4697]: E0220 16:52:45.181101 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon-log" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.181116 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon-log" Feb 20 16:52:45 crc kubenswrapper[4697]: E0220 16:52:45.181146 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.181152 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.181359 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon-log" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.181383 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6667b0d-626d-4578-9767-2026d21b1583" containerName="horizon" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.182234 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7pd9r" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.201979 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7pd9r"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.298843 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-j9gxf"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.301749 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-j9gxf" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.343176 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f2fd71a-c008-4059-b649-f881e3809691-operator-scripts\") pod \"nova-api-db-create-7pd9r\" (UID: \"5f2fd71a-c008-4059-b649-f881e3809691\") " pod="openstack/nova-api-db-create-7pd9r" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.343230 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2mbt\" (UniqueName: \"kubernetes.io/projected/5f2fd71a-c008-4059-b649-f881e3809691-kube-api-access-k2mbt\") pod \"nova-api-db-create-7pd9r\" (UID: \"5f2fd71a-c008-4059-b649-f881e3809691\") " pod="openstack/nova-api-db-create-7pd9r" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.364175 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-j9gxf"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.408528 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1713-account-create-update-6vhb6"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.410135 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1713-account-create-update-6vhb6" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.415105 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.416708 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1713-account-create-update-6vhb6"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.445633 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-operator-scripts\") pod \"nova-cell0-db-create-j9gxf\" (UID: \"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3\") " pod="openstack/nova-cell0-db-create-j9gxf" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.445792 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx78h\" (UniqueName: \"kubernetes.io/projected/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-kube-api-access-tx78h\") pod \"nova-cell0-db-create-j9gxf\" (UID: \"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3\") " pod="openstack/nova-cell0-db-create-j9gxf" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.445852 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f2fd71a-c008-4059-b649-f881e3809691-operator-scripts\") pod \"nova-api-db-create-7pd9r\" (UID: \"5f2fd71a-c008-4059-b649-f881e3809691\") " pod="openstack/nova-api-db-create-7pd9r" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.445892 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2mbt\" (UniqueName: \"kubernetes.io/projected/5f2fd71a-c008-4059-b649-f881e3809691-kube-api-access-k2mbt\") pod \"nova-api-db-create-7pd9r\" (UID: \"5f2fd71a-c008-4059-b649-f881e3809691\") " 
pod="openstack/nova-api-db-create-7pd9r" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.446819 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f2fd71a-c008-4059-b649-f881e3809691-operator-scripts\") pod \"nova-api-db-create-7pd9r\" (UID: \"5f2fd71a-c008-4059-b649-f881e3809691\") " pod="openstack/nova-api-db-create-7pd9r" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.465649 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2mbt\" (UniqueName: \"kubernetes.io/projected/5f2fd71a-c008-4059-b649-f881e3809691-kube-api-access-k2mbt\") pod \"nova-api-db-create-7pd9r\" (UID: \"5f2fd71a-c008-4059-b649-f881e3809691\") " pod="openstack/nova-api-db-create-7pd9r" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.481449 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jhkvr"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.488240 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jhkvr" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.499606 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jhkvr"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.516894 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7pd9r" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.550619 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dddf9\" (UniqueName: \"kubernetes.io/projected/d1fc0882-9076-4159-86d9-bf9795206baf-kube-api-access-dddf9\") pod \"nova-api-1713-account-create-update-6vhb6\" (UID: \"d1fc0882-9076-4159-86d9-bf9795206baf\") " pod="openstack/nova-api-1713-account-create-update-6vhb6" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.550680 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx78h\" (UniqueName: \"kubernetes.io/projected/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-kube-api-access-tx78h\") pod \"nova-cell0-db-create-j9gxf\" (UID: \"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3\") " pod="openstack/nova-cell0-db-create-j9gxf" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.550703 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1fc0882-9076-4159-86d9-bf9795206baf-operator-scripts\") pod \"nova-api-1713-account-create-update-6vhb6\" (UID: \"d1fc0882-9076-4159-86d9-bf9795206baf\") " pod="openstack/nova-api-1713-account-create-update-6vhb6" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.550788 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-operator-scripts\") pod \"nova-cell0-db-create-j9gxf\" (UID: \"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3\") " pod="openstack/nova-cell0-db-create-j9gxf" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.556573 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-operator-scripts\") pod 
\"nova-cell0-db-create-j9gxf\" (UID: \"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3\") " pod="openstack/nova-cell0-db-create-j9gxf" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.585237 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx78h\" (UniqueName: \"kubernetes.io/projected/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-kube-api-access-tx78h\") pod \"nova-cell0-db-create-j9gxf\" (UID: \"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3\") " pod="openstack/nova-cell0-db-create-j9gxf" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.619095 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fd79-account-create-update-gz4br"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.622663 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fd79-account-create-update-gz4br" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.629473 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.633417 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fd79-account-create-update-gz4br"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.643598 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.655941 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrf5q\" (UniqueName: \"kubernetes.io/projected/97792b78-572a-47ab-9ed8-381adea8950c-kube-api-access-wrf5q\") pod \"nova-cell1-db-create-jhkvr\" (UID: \"97792b78-572a-47ab-9ed8-381adea8950c\") " pod="openstack/nova-cell1-db-create-jhkvr" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.656631 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dddf9\" (UniqueName: 
\"kubernetes.io/projected/d1fc0882-9076-4159-86d9-bf9795206baf-kube-api-access-dddf9\") pod \"nova-api-1713-account-create-update-6vhb6\" (UID: \"d1fc0882-9076-4159-86d9-bf9795206baf\") " pod="openstack/nova-api-1713-account-create-update-6vhb6" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.656707 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97792b78-572a-47ab-9ed8-381adea8950c-operator-scripts\") pod \"nova-cell1-db-create-jhkvr\" (UID: \"97792b78-572a-47ab-9ed8-381adea8950c\") " pod="openstack/nova-cell1-db-create-jhkvr" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.656760 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1fc0882-9076-4159-86d9-bf9795206baf-operator-scripts\") pod \"nova-api-1713-account-create-update-6vhb6\" (UID: \"d1fc0882-9076-4159-86d9-bf9795206baf\") " pod="openstack/nova-api-1713-account-create-update-6vhb6" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.659277 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1fc0882-9076-4159-86d9-bf9795206baf-operator-scripts\") pod \"nova-api-1713-account-create-update-6vhb6\" (UID: \"d1fc0882-9076-4159-86d9-bf9795206baf\") " pod="openstack/nova-api-1713-account-create-update-6vhb6" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.668346 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-j9gxf" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.681757 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dddf9\" (UniqueName: \"kubernetes.io/projected/d1fc0882-9076-4159-86d9-bf9795206baf-kube-api-access-dddf9\") pod \"nova-api-1713-account-create-update-6vhb6\" (UID: \"d1fc0882-9076-4159-86d9-bf9795206baf\") " pod="openstack/nova-api-1713-account-create-update-6vhb6" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.748863 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1713-account-create-update-6vhb6" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.767072 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrf5q\" (UniqueName: \"kubernetes.io/projected/97792b78-572a-47ab-9ed8-381adea8950c-kube-api-access-wrf5q\") pod \"nova-cell1-db-create-jhkvr\" (UID: \"97792b78-572a-47ab-9ed8-381adea8950c\") " pod="openstack/nova-cell1-db-create-jhkvr" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.767143 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaa21f75-ab3a-4a94-8350-a891760b38b8-operator-scripts\") pod \"nova-cell0-fd79-account-create-update-gz4br\" (UID: \"aaa21f75-ab3a-4a94-8350-a891760b38b8\") " pod="openstack/nova-cell0-fd79-account-create-update-gz4br" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.767213 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddf92\" (UniqueName: \"kubernetes.io/projected/aaa21f75-ab3a-4a94-8350-a891760b38b8-kube-api-access-ddf92\") pod \"nova-cell0-fd79-account-create-update-gz4br\" (UID: \"aaa21f75-ab3a-4a94-8350-a891760b38b8\") " pod="openstack/nova-cell0-fd79-account-create-update-gz4br" Feb 20 16:52:45 crc 
kubenswrapper[4697]: I0220 16:52:45.767270 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97792b78-572a-47ab-9ed8-381adea8950c-operator-scripts\") pod \"nova-cell1-db-create-jhkvr\" (UID: \"97792b78-572a-47ab-9ed8-381adea8950c\") " pod="openstack/nova-cell1-db-create-jhkvr" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.769771 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97792b78-572a-47ab-9ed8-381adea8950c-operator-scripts\") pod \"nova-cell1-db-create-jhkvr\" (UID: \"97792b78-572a-47ab-9ed8-381adea8950c\") " pod="openstack/nova-cell1-db-create-jhkvr" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.853927 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrf5q\" (UniqueName: \"kubernetes.io/projected/97792b78-572a-47ab-9ed8-381adea8950c-kube-api-access-wrf5q\") pod \"nova-cell1-db-create-jhkvr\" (UID: \"97792b78-572a-47ab-9ed8-381adea8950c\") " pod="openstack/nova-cell1-db-create-jhkvr" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.869500 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-246e-account-create-update-5h8zj"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.870949 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaa21f75-ab3a-4a94-8350-a891760b38b8-operator-scripts\") pod \"nova-cell0-fd79-account-create-update-gz4br\" (UID: \"aaa21f75-ab3a-4a94-8350-a891760b38b8\") " pod="openstack/nova-cell0-fd79-account-create-update-gz4br" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.870995 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddf92\" (UniqueName: 
\"kubernetes.io/projected/aaa21f75-ab3a-4a94-8350-a891760b38b8-kube-api-access-ddf92\") pod \"nova-cell0-fd79-account-create-update-gz4br\" (UID: \"aaa21f75-ab3a-4a94-8350-a891760b38b8\") " pod="openstack/nova-cell0-fd79-account-create-update-gz4br" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.871295 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c","Type":"ContainerDied","Data":"3ac5779a2c08cfdf37177a0b0971baaa2d0dd5bc7ccf8aabcba210efda1da730"} Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.871390 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-246e-account-create-update-5h8zj" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.869780 4697 generic.go:334] "Generic (PLEG): container finished" podID="fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" containerID="3ac5779a2c08cfdf37177a0b0971baaa2d0dd5bc7ccf8aabcba210efda1da730" exitCode=0 Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.872217 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaa21f75-ab3a-4a94-8350-a891760b38b8-operator-scripts\") pod \"nova-cell0-fd79-account-create-update-gz4br\" (UID: \"aaa21f75-ab3a-4a94-8350-a891760b38b8\") " pod="openstack/nova-cell0-fd79-account-create-update-gz4br" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.878760 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.898735 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddf92\" (UniqueName: \"kubernetes.io/projected/aaa21f75-ab3a-4a94-8350-a891760b38b8-kube-api-access-ddf92\") pod \"nova-cell0-fd79-account-create-update-gz4br\" (UID: \"aaa21f75-ab3a-4a94-8350-a891760b38b8\") " 
pod="openstack/nova-cell0-fd79-account-create-update-gz4br" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.934629 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-246e-account-create-update-5h8zj"] Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.958478 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fd79-account-create-update-gz4br" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.977782 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-operator-scripts\") pod \"nova-cell1-246e-account-create-update-5h8zj\" (UID: \"dfab7d0c-8b89-47a2-a01f-03e8962ccb92\") " pod="openstack/nova-cell1-246e-account-create-update-5h8zj" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.977966 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qrdh\" (UniqueName: \"kubernetes.io/projected/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-kube-api-access-7qrdh\") pod \"nova-cell1-246e-account-create-update-5h8zj\" (UID: \"dfab7d0c-8b89-47a2-a01f-03e8962ccb92\") " pod="openstack/nova-cell1-246e-account-create-update-5h8zj" Feb 20 16:52:45 crc kubenswrapper[4697]: I0220 16:52:45.979487 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cd1e114-c1b8-4c57-a890-abd5708ce27f","Type":"ContainerStarted","Data":"ce4be8338fc3f288907259e62015d0c33d03e1eb65beb628ba9665f6c8ca85f4"} Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.088331 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qrdh\" (UniqueName: \"kubernetes.io/projected/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-kube-api-access-7qrdh\") pod \"nova-cell1-246e-account-create-update-5h8zj\" (UID: 
\"dfab7d0c-8b89-47a2-a01f-03e8962ccb92\") " pod="openstack/nova-cell1-246e-account-create-update-5h8zj" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.088398 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-operator-scripts\") pod \"nova-cell1-246e-account-create-update-5h8zj\" (UID: \"dfab7d0c-8b89-47a2-a01f-03e8962ccb92\") " pod="openstack/nova-cell1-246e-account-create-update-5h8zj" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.089284 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-operator-scripts\") pod \"nova-cell1-246e-account-create-update-5h8zj\" (UID: \"dfab7d0c-8b89-47a2-a01f-03e8962ccb92\") " pod="openstack/nova-cell1-246e-account-create-update-5h8zj" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.114636 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jhkvr" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.127740 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qrdh\" (UniqueName: \"kubernetes.io/projected/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-kube-api-access-7qrdh\") pod \"nova-cell1-246e-account-create-update-5h8zj\" (UID: \"dfab7d0c-8b89-47a2-a01f-03e8962ccb92\") " pod="openstack/nova-cell1-246e-account-create-update-5h8zj" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.174993 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-246e-account-create-update-5h8zj" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.302167 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.395120 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-httpd-run\") pod \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.395243 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn2rm\" (UniqueName: \"kubernetes.io/projected/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-kube-api-access-kn2rm\") pod \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.395274 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-internal-tls-certs\") pod \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.398688 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-combined-ca-bundle\") pod \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.399645 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" (UID: "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.399868 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-scripts\") pod \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.399908 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-config-data\") pod \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.400828 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.400861 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-logs\") pod \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\" (UID: \"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c\") " Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.402193 4697 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.403490 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-logs" (OuterVolumeSpecName: "logs") pod "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" (UID: "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.409729 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-kube-api-access-kn2rm" (OuterVolumeSpecName: "kube-api-access-kn2rm") pod "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" (UID: "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c"). InnerVolumeSpecName "kube-api-access-kn2rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.409898 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-scripts" (OuterVolumeSpecName: "scripts") pod "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" (UID: "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.419595 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" (UID: "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.472509 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" (UID: "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.494609 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-config-data" (OuterVolumeSpecName: "config-data") pod "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" (UID: "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.506660 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.506686 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.506718 4697 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.506729 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.506738 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn2rm\" (UniqueName: \"kubernetes.io/projected/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-kube-api-access-kn2rm\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.506749 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.536777 4697 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.568488 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" (UID: "fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.610775 4697 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.610821 4697 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.712855 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-j9gxf"] Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.729794 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7pd9r"] Feb 20 16:52:46 crc kubenswrapper[4697]: W0220 16:52:46.895737 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97792b78_572a_47ab_9ed8_381adea8950c.slice/crio-4d06de49c53b077f2b80f31670dab79e820abe967c3551fa46d4920c10b83ac7 WatchSource:0}: Error finding container 
4d06de49c53b077f2b80f31670dab79e820abe967c3551fa46d4920c10b83ac7: Status 404 returned error can't find the container with id 4d06de49c53b077f2b80f31670dab79e820abe967c3551fa46d4920c10b83ac7 Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.901486 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jhkvr"] Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.914865 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fd79-account-create-update-gz4br"] Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.931504 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1713-account-create-update-6vhb6"] Feb 20 16:52:46 crc kubenswrapper[4697]: I0220 16:52:46.943554 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-246e-account-create-update-5h8zj"] Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.012148 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7pd9r" event={"ID":"5f2fd71a-c008-4059-b649-f881e3809691","Type":"ContainerStarted","Data":"a719d83d63f02f0801c212145170bbcca67c3e80b43b17f7c3d0b77b6755c674"} Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.017790 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1713-account-create-update-6vhb6" event={"ID":"d1fc0882-9076-4159-86d9-bf9795206baf","Type":"ContainerStarted","Data":"c8a561d1a03bc951523f0fce5353735ed0458fe7697cd0992bf4d846b3e1a227"} Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.018681 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j9gxf" event={"ID":"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3","Type":"ContainerStarted","Data":"ba0e1bdb7f3449e0e153a3ea40a29b893b752ef4ead230659ae1488344eeee57"} Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.020014 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jhkvr" 
event={"ID":"97792b78-572a-47ab-9ed8-381adea8950c","Type":"ContainerStarted","Data":"4d06de49c53b077f2b80f31670dab79e820abe967c3551fa46d4920c10b83ac7"} Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.024658 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fd79-account-create-update-gz4br" event={"ID":"aaa21f75-ab3a-4a94-8350-a891760b38b8","Type":"ContainerStarted","Data":"d4afb497efe0a82d1b62f755d57130befb082546522aecddac9b7e32a619f54a"} Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.029013 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-246e-account-create-update-5h8zj" event={"ID":"dfab7d0c-8b89-47a2-a01f-03e8962ccb92","Type":"ContainerStarted","Data":"752df5373b29f38b785169df9e1dc69c34dd15190a5cf125eeb2c32b95db49ec"} Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.034921 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c","Type":"ContainerDied","Data":"ba05bda7b682ff71d79b2ac3861a6d217cdddc22ea8e7020461aca63096e1f61"} Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.034971 4697 scope.go:117] "RemoveContainer" containerID="3ac5779a2c08cfdf37177a0b0971baaa2d0dd5bc7ccf8aabcba210efda1da730" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.035092 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.082406 4697 scope.go:117] "RemoveContainer" containerID="5da033a97a3e25310bf04bbbf86d1464185a251c43b5c741b61fd0b34e5d176b" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.095896 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.115401 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.137973 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 16:52:47 crc kubenswrapper[4697]: E0220 16:52:47.138335 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" containerName="glance-log" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.138352 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" containerName="glance-log" Feb 20 16:52:47 crc kubenswrapper[4697]: E0220 16:52:47.138389 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" containerName="glance-httpd" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.138396 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" containerName="glance-httpd" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.138580 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" containerName="glance-log" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.138594 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" containerName="glance-httpd" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.140078 4697 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.142509 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.142509 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.159805 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.331512 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.331572 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqmdv\" (UniqueName: \"kubernetes.io/projected/2eec23bf-874f-423a-8d8c-b3f20b494c87-kube-api-access-xqmdv\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.331601 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.331633 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.331694 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.331800 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2eec23bf-874f-423a-8d8c-b3f20b494c87-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.331832 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2eec23bf-874f-423a-8d8c-b3f20b494c87-logs\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.331874 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.436115 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.436167 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqmdv\" (UniqueName: \"kubernetes.io/projected/2eec23bf-874f-423a-8d8c-b3f20b494c87-kube-api-access-xqmdv\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.436199 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.436226 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.436270 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.436337 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2eec23bf-874f-423a-8d8c-b3f20b494c87-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.436414 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2eec23bf-874f-423a-8d8c-b3f20b494c87-logs\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.436524 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.437644 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2eec23bf-874f-423a-8d8c-b3f20b494c87-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.438109 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.438836 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2eec23bf-874f-423a-8d8c-b3f20b494c87-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.463146 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.479098 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.479866 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.483595 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eec23bf-874f-423a-8d8c-b3f20b494c87-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.486795 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqmdv\" (UniqueName: \"kubernetes.io/projected/2eec23bf-874f-423a-8d8c-b3f20b494c87-kube-api-access-xqmdv\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " 
pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.527790 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2eec23bf-874f-423a-8d8c-b3f20b494c87\") " pod="openstack/glance-default-internal-api-0" Feb 20 16:52:47 crc kubenswrapper[4697]: I0220 16:52:47.763998 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.059137 4697 generic.go:334] "Generic (PLEG): container finished" podID="d1fc0882-9076-4159-86d9-bf9795206baf" containerID="75842581c54de9b4446ae5e3112d2449a92a69c03dc956adf7274152668901a2" exitCode=0 Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.059386 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1713-account-create-update-6vhb6" event={"ID":"d1fc0882-9076-4159-86d9-bf9795206baf","Type":"ContainerDied","Data":"75842581c54de9b4446ae5e3112d2449a92a69c03dc956adf7274152668901a2"} Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.072221 4697 generic.go:334] "Generic (PLEG): container finished" podID="95a0cc76-ae5d-4ed2-898b-8e9ade19ada3" containerID="fa19488dcacf05279d16a1a14ed686fdea5626f3196354026d783ddadf15c472" exitCode=0 Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.072294 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j9gxf" event={"ID":"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3","Type":"ContainerDied","Data":"fa19488dcacf05279d16a1a14ed686fdea5626f3196354026d783ddadf15c472"} Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.083212 4697 generic.go:334] "Generic (PLEG): container finished" podID="97792b78-572a-47ab-9ed8-381adea8950c" containerID="7b17efc22c4ec86c5200d4bffbdadf08889f0fec0d312fc6ff2a65b8df4b5f9a" exitCode=0 Feb 20 
16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.083268 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jhkvr" event={"ID":"97792b78-572a-47ab-9ed8-381adea8950c","Type":"ContainerDied","Data":"7b17efc22c4ec86c5200d4bffbdadf08889f0fec0d312fc6ff2a65b8df4b5f9a"} Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.086743 4697 generic.go:334] "Generic (PLEG): container finished" podID="dfab7d0c-8b89-47a2-a01f-03e8962ccb92" containerID="576e2ecdc40c7372f87b52db3659801e13d4d186023b1c5c2867ddd37ec318be" exitCode=0 Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.086848 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-246e-account-create-update-5h8zj" event={"ID":"dfab7d0c-8b89-47a2-a01f-03e8962ccb92","Type":"ContainerDied","Data":"576e2ecdc40c7372f87b52db3659801e13d4d186023b1c5c2867ddd37ec318be"} Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.089562 4697 generic.go:334] "Generic (PLEG): container finished" podID="aaa21f75-ab3a-4a94-8350-a891760b38b8" containerID="f6e7c4cdd439befc501312edc8b2fc4d4d32159105ca6e34d18a6c535f993564" exitCode=0 Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.089620 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fd79-account-create-update-gz4br" event={"ID":"aaa21f75-ab3a-4a94-8350-a891760b38b8","Type":"ContainerDied","Data":"f6e7c4cdd439befc501312edc8b2fc4d4d32159105ca6e34d18a6c535f993564"} Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.097808 4697 generic.go:334] "Generic (PLEG): container finished" podID="5f2fd71a-c008-4059-b649-f881e3809691" containerID="1c9a012e3246ea27666bcbac62224f7dab59619a5367de1a8740b455272625e0" exitCode=0 Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.097862 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7pd9r" 
event={"ID":"5f2fd71a-c008-4059-b649-f881e3809691","Type":"ContainerDied","Data":"1c9a012e3246ea27666bcbac62224f7dab59619a5367de1a8740b455272625e0"} Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.114537 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cd1e114-c1b8-4c57-a890-abd5708ce27f","Type":"ContainerStarted","Data":"a85e2d0cac5a29e1de828658fb8697a39f10b0604526427faa1f373616a8c950"} Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.114717 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="ceilometer-central-agent" containerID="cri-o://cc4d4ee346506e8d7c5d7bd173bdccf98fc8714174809e7252e1bd87d0d0d276" gracePeriod=30 Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.114776 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.114835 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="sg-core" containerID="cri-o://ce4be8338fc3f288907259e62015d0c33d03e1eb65beb628ba9665f6c8ca85f4" gracePeriod=30 Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.114844 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="ceilometer-notification-agent" containerID="cri-o://a09dd56597828e773d401594c72aa03914026cc5879c843c41a59bfa76e5b55d" gracePeriod=30 Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.114923 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="proxy-httpd" containerID="cri-o://a85e2d0cac5a29e1de828658fb8697a39f10b0604526427faa1f373616a8c950" 
gracePeriod=30 Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.269251 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.983136042 podStartE2EDuration="6.269231922s" podCreationTimestamp="2026-02-20 16:52:42 +0000 UTC" firstStartedPulling="2026-02-20 16:52:43.872258276 +0000 UTC m=+1271.652303684" lastFinishedPulling="2026-02-20 16:52:47.158354156 +0000 UTC m=+1274.938399564" observedRunningTime="2026-02-20 16:52:48.179263197 +0000 UTC m=+1275.959308605" watchObservedRunningTime="2026-02-20 16:52:48.269231922 +0000 UTC m=+1276.049277330" Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.303680 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 16:52:48 crc kubenswrapper[4697]: I0220 16:52:48.888423 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c" path="/var/lib/kubelet/pods/fe52f5b5-82e6-4ad2-9c3f-b5d6bac35d1c/volumes" Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.129062 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2eec23bf-874f-423a-8d8c-b3f20b494c87","Type":"ContainerStarted","Data":"bc71e7be1cdb06a77c038199cc35c9b5a118397564b0397a7e4f9805a433468b"} Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.129114 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2eec23bf-874f-423a-8d8c-b3f20b494c87","Type":"ContainerStarted","Data":"f87c8494ed0b30039a0e065f5b2369a5d8ed447048adf5e7f4f42bb8e6ac3ae2"} Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.131683 4697 generic.go:334] "Generic (PLEG): container finished" podID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerID="a85e2d0cac5a29e1de828658fb8697a39f10b0604526427faa1f373616a8c950" exitCode=0 Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.131718 4697 
generic.go:334] "Generic (PLEG): container finished" podID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerID="ce4be8338fc3f288907259e62015d0c33d03e1eb65beb628ba9665f6c8ca85f4" exitCode=2 Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.131730 4697 generic.go:334] "Generic (PLEG): container finished" podID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerID="a09dd56597828e773d401594c72aa03914026cc5879c843c41a59bfa76e5b55d" exitCode=0 Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.131881 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cd1e114-c1b8-4c57-a890-abd5708ce27f","Type":"ContainerDied","Data":"a85e2d0cac5a29e1de828658fb8697a39f10b0604526427faa1f373616a8c950"} Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.131906 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cd1e114-c1b8-4c57-a890-abd5708ce27f","Type":"ContainerDied","Data":"ce4be8338fc3f288907259e62015d0c33d03e1eb65beb628ba9665f6c8ca85f4"} Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.131916 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cd1e114-c1b8-4c57-a890-abd5708ce27f","Type":"ContainerDied","Data":"a09dd56597828e773d401594c72aa03914026cc5879c843c41a59bfa76e5b55d"} Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.558042 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-246e-account-create-update-5h8zj" Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.621615 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qrdh\" (UniqueName: \"kubernetes.io/projected/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-kube-api-access-7qrdh\") pod \"dfab7d0c-8b89-47a2-a01f-03e8962ccb92\" (UID: \"dfab7d0c-8b89-47a2-a01f-03e8962ccb92\") " Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.621746 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-operator-scripts\") pod \"dfab7d0c-8b89-47a2-a01f-03e8962ccb92\" (UID: \"dfab7d0c-8b89-47a2-a01f-03e8962ccb92\") " Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.622754 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dfab7d0c-8b89-47a2-a01f-03e8962ccb92" (UID: "dfab7d0c-8b89-47a2-a01f-03e8962ccb92"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.633153 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-kube-api-access-7qrdh" (OuterVolumeSpecName: "kube-api-access-7qrdh") pod "dfab7d0c-8b89-47a2-a01f-03e8962ccb92" (UID: "dfab7d0c-8b89-47a2-a01f-03e8962ccb92"). InnerVolumeSpecName "kube-api-access-7qrdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.724187 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qrdh\" (UniqueName: \"kubernetes.io/projected/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-kube-api-access-7qrdh\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:49 crc kubenswrapper[4697]: I0220 16:52:49.724220 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfab7d0c-8b89-47a2-a01f-03e8962ccb92-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.049373 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jhkvr" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.081658 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7pd9r" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.094170 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fd79-account-create-update-gz4br" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.111505 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-j9gxf" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.131597 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1713-account-create-update-6vhb6" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.140797 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.141777 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaa21f75-ab3a-4a94-8350-a891760b38b8-operator-scripts\") pod \"aaa21f75-ab3a-4a94-8350-a891760b38b8\" (UID: \"aaa21f75-ab3a-4a94-8350-a891760b38b8\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.141869 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddf92\" (UniqueName: \"kubernetes.io/projected/aaa21f75-ab3a-4a94-8350-a891760b38b8-kube-api-access-ddf92\") pod \"aaa21f75-ab3a-4a94-8350-a891760b38b8\" (UID: \"aaa21f75-ab3a-4a94-8350-a891760b38b8\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.141938 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrf5q\" (UniqueName: \"kubernetes.io/projected/97792b78-572a-47ab-9ed8-381adea8950c-kube-api-access-wrf5q\") pod \"97792b78-572a-47ab-9ed8-381adea8950c\" (UID: \"97792b78-572a-47ab-9ed8-381adea8950c\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.141982 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97792b78-572a-47ab-9ed8-381adea8950c-operator-scripts\") pod \"97792b78-572a-47ab-9ed8-381adea8950c\" (UID: \"97792b78-572a-47ab-9ed8-381adea8950c\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.142018 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx78h\" (UniqueName: \"kubernetes.io/projected/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-kube-api-access-tx78h\") pod \"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3\" (UID: \"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.142073 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f2fd71a-c008-4059-b649-f881e3809691-operator-scripts\") pod \"5f2fd71a-c008-4059-b649-f881e3809691\" (UID: \"5f2fd71a-c008-4059-b649-f881e3809691\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.142156 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2mbt\" (UniqueName: \"kubernetes.io/projected/5f2fd71a-c008-4059-b649-f881e3809691-kube-api-access-k2mbt\") pod \"5f2fd71a-c008-4059-b649-f881e3809691\" (UID: \"5f2fd71a-c008-4059-b649-f881e3809691\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.142210 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa21f75-ab3a-4a94-8350-a891760b38b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aaa21f75-ab3a-4a94-8350-a891760b38b8" (UID: "aaa21f75-ab3a-4a94-8350-a891760b38b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.142231 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-operator-scripts\") pod \"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3\" (UID: \"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.142590 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97792b78-572a-47ab-9ed8-381adea8950c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97792b78-572a-47ab-9ed8-381adea8950c" (UID: "97792b78-572a-47ab-9ed8-381adea8950c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.142843 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaa21f75-ab3a-4a94-8350-a891760b38b8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.142861 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97792b78-572a-47ab-9ed8-381adea8950c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.142865 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f2fd71a-c008-4059-b649-f881e3809691-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f2fd71a-c008-4059-b649-f881e3809691" (UID: "5f2fd71a-c008-4059-b649-f881e3809691"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.143182 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95a0cc76-ae5d-4ed2-898b-8e9ade19ada3" (UID: "95a0cc76-ae5d-4ed2-898b-8e9ade19ada3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.160540 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-j9gxf" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.161332 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j9gxf" event={"ID":"95a0cc76-ae5d-4ed2-898b-8e9ade19ada3","Type":"ContainerDied","Data":"ba0e1bdb7f3449e0e153a3ea40a29b893b752ef4ead230659ae1488344eeee57"} Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.161365 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba0e1bdb7f3449e0e153a3ea40a29b893b752ef4ead230659ae1488344eeee57" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.164948 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97792b78-572a-47ab-9ed8-381adea8950c-kube-api-access-wrf5q" (OuterVolumeSpecName: "kube-api-access-wrf5q") pod "97792b78-572a-47ab-9ed8-381adea8950c" (UID: "97792b78-572a-47ab-9ed8-381adea8950c"). InnerVolumeSpecName "kube-api-access-wrf5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.165744 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f2fd71a-c008-4059-b649-f881e3809691-kube-api-access-k2mbt" (OuterVolumeSpecName: "kube-api-access-k2mbt") pod "5f2fd71a-c008-4059-b649-f881e3809691" (UID: "5f2fd71a-c008-4059-b649-f881e3809691"). InnerVolumeSpecName "kube-api-access-k2mbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.165860 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-kube-api-access-tx78h" (OuterVolumeSpecName: "kube-api-access-tx78h") pod "95a0cc76-ae5d-4ed2-898b-8e9ade19ada3" (UID: "95a0cc76-ae5d-4ed2-898b-8e9ade19ada3"). InnerVolumeSpecName "kube-api-access-tx78h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.166508 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jhkvr" event={"ID":"97792b78-572a-47ab-9ed8-381adea8950c","Type":"ContainerDied","Data":"4d06de49c53b077f2b80f31670dab79e820abe967c3551fa46d4920c10b83ac7"} Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.166538 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d06de49c53b077f2b80f31670dab79e820abe967c3551fa46d4920c10b83ac7" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.166544 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jhkvr" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.170923 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fd79-account-create-update-gz4br" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.170920 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fd79-account-create-update-gz4br" event={"ID":"aaa21f75-ab3a-4a94-8350-a891760b38b8","Type":"ContainerDied","Data":"d4afb497efe0a82d1b62f755d57130befb082546522aecddac9b7e32a619f54a"} Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.171336 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4afb497efe0a82d1b62f755d57130befb082546522aecddac9b7e32a619f54a" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.176861 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaa21f75-ab3a-4a94-8350-a891760b38b8-kube-api-access-ddf92" (OuterVolumeSpecName: "kube-api-access-ddf92") pod "aaa21f75-ab3a-4a94-8350-a891760b38b8" (UID: "aaa21f75-ab3a-4a94-8350-a891760b38b8"). InnerVolumeSpecName "kube-api-access-ddf92". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.179950 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-246e-account-create-update-5h8zj" event={"ID":"dfab7d0c-8b89-47a2-a01f-03e8962ccb92","Type":"ContainerDied","Data":"752df5373b29f38b785169df9e1dc69c34dd15190a5cf125eeb2c32b95db49ec"} Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.179987 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="752df5373b29f38b785169df9e1dc69c34dd15190a5cf125eeb2c32b95db49ec" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.180041 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-246e-account-create-update-5h8zj" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.184568 4697 generic.go:334] "Generic (PLEG): container finished" podID="7eaf42b1-80b0-43d3-8820-0b17637d46a6" containerID="cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652" exitCode=137 Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.184776 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7eaf42b1-80b0-43d3-8820-0b17637d46a6","Type":"ContainerDied","Data":"cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652"} Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.184864 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7eaf42b1-80b0-43d3-8820-0b17637d46a6","Type":"ContainerDied","Data":"4d8b7ff22e776c97ce5a9c99a456a0588c35f1d90c535e45641566069dc332e5"} Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.184933 4697 scope.go:117] "RemoveContainer" containerID="cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.185112 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.187935 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7pd9r" event={"ID":"5f2fd71a-c008-4059-b649-f881e3809691","Type":"ContainerDied","Data":"a719d83d63f02f0801c212145170bbcca67c3e80b43b17f7c3d0b77b6755c674"} Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.188094 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a719d83d63f02f0801c212145170bbcca67c3e80b43b17f7c3d0b77b6755c674" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.188137 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7pd9r" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.202619 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1713-account-create-update-6vhb6" event={"ID":"d1fc0882-9076-4159-86d9-bf9795206baf","Type":"ContainerDied","Data":"c8a561d1a03bc951523f0fce5353735ed0458fe7697cd0992bf4d846b3e1a227"} Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.202702 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8a561d1a03bc951523f0fce5353735ed0458fe7697cd0992bf4d846b3e1a227" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.202768 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1713-account-create-update-6vhb6" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.221606 4697 scope.go:117] "RemoveContainer" containerID="a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.243762 4697 scope.go:117] "RemoveContainer" containerID="cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.246350 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7eaf42b1-80b0-43d3-8820-0b17637d46a6-etc-machine-id\") pod \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.246536 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eaf42b1-80b0-43d3-8820-0b17637d46a6-logs\") pod \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.246651 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-combined-ca-bundle\") pod \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.246722 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pg6l\" (UniqueName: \"kubernetes.io/projected/7eaf42b1-80b0-43d3-8820-0b17637d46a6-kube-api-access-5pg6l\") pod \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.246851 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/7eaf42b1-80b0-43d3-8820-0b17637d46a6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7eaf42b1-80b0-43d3-8820-0b17637d46a6" (UID: "7eaf42b1-80b0-43d3-8820-0b17637d46a6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.246881 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data\") pod \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.247056 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data-custom\") pod \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.247157 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1fc0882-9076-4159-86d9-bf9795206baf-operator-scripts\") pod \"d1fc0882-9076-4159-86d9-bf9795206baf\" (UID: \"d1fc0882-9076-4159-86d9-bf9795206baf\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.247278 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-scripts\") pod \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\" (UID: \"7eaf42b1-80b0-43d3-8820-0b17637d46a6\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.247372 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dddf9\" (UniqueName: \"kubernetes.io/projected/d1fc0882-9076-4159-86d9-bf9795206baf-kube-api-access-dddf9\") pod 
\"d1fc0882-9076-4159-86d9-bf9795206baf\" (UID: \"d1fc0882-9076-4159-86d9-bf9795206baf\") " Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.247954 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.248051 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddf92\" (UniqueName: \"kubernetes.io/projected/aaa21f75-ab3a-4a94-8350-a891760b38b8-kube-api-access-ddf92\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.248129 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrf5q\" (UniqueName: \"kubernetes.io/projected/97792b78-572a-47ab-9ed8-381adea8950c-kube-api-access-wrf5q\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.248204 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx78h\" (UniqueName: \"kubernetes.io/projected/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3-kube-api-access-tx78h\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.248266 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f2fd71a-c008-4059-b649-f881e3809691-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.248320 4697 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7eaf42b1-80b0-43d3-8820-0b17637d46a6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.248391 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2mbt\" (UniqueName: \"kubernetes.io/projected/5f2fd71a-c008-4059-b649-f881e3809691-kube-api-access-k2mbt\") 
on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.249187 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eaf42b1-80b0-43d3-8820-0b17637d46a6-logs" (OuterVolumeSpecName: "logs") pod "7eaf42b1-80b0-43d3-8820-0b17637d46a6" (UID: "7eaf42b1-80b0-43d3-8820-0b17637d46a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.249593 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1fc0882-9076-4159-86d9-bf9795206baf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1fc0882-9076-4159-86d9-bf9795206baf" (UID: "d1fc0882-9076-4159-86d9-bf9795206baf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: E0220 16:52:50.251839 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652\": container with ID starting with cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652 not found: ID does not exist" containerID="cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.251897 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652"} err="failed to get container status \"cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652\": rpc error: code = NotFound desc = could not find container \"cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652\": container with ID starting with cc93c1e73af611c99901391fa29ae6b0eeec5dc1799e1ac5b05def9577bfe652 not found: ID does not exist" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 
16:52:50.251923 4697 scope.go:117] "RemoveContainer" containerID="a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.253690 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7eaf42b1-80b0-43d3-8820-0b17637d46a6" (UID: "7eaf42b1-80b0-43d3-8820-0b17637d46a6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.253747 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eaf42b1-80b0-43d3-8820-0b17637d46a6-kube-api-access-5pg6l" (OuterVolumeSpecName: "kube-api-access-5pg6l") pod "7eaf42b1-80b0-43d3-8820-0b17637d46a6" (UID: "7eaf42b1-80b0-43d3-8820-0b17637d46a6"). InnerVolumeSpecName "kube-api-access-5pg6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: E0220 16:52:50.253900 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca\": container with ID starting with a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca not found: ID does not exist" containerID="a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.253924 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca"} err="failed to get container status \"a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca\": rpc error: code = NotFound desc = could not find container \"a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca\": container with ID 
starting with a89f90994f524f8df379e9006e22ebbfe82f6b69b8fdbe0f86cad1821863b1ca not found: ID does not exist" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.255722 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-scripts" (OuterVolumeSpecName: "scripts") pod "7eaf42b1-80b0-43d3-8820-0b17637d46a6" (UID: "7eaf42b1-80b0-43d3-8820-0b17637d46a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.269047 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1fc0882-9076-4159-86d9-bf9795206baf-kube-api-access-dddf9" (OuterVolumeSpecName: "kube-api-access-dddf9") pod "d1fc0882-9076-4159-86d9-bf9795206baf" (UID: "d1fc0882-9076-4159-86d9-bf9795206baf"). InnerVolumeSpecName "kube-api-access-dddf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.291640 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7eaf42b1-80b0-43d3-8820-0b17637d46a6" (UID: "7eaf42b1-80b0-43d3-8820-0b17637d46a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.319272 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data" (OuterVolumeSpecName: "config-data") pod "7eaf42b1-80b0-43d3-8820-0b17637d46a6" (UID: "7eaf42b1-80b0-43d3-8820-0b17637d46a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.354447 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eaf42b1-80b0-43d3-8820-0b17637d46a6-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.354479 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.354490 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pg6l\" (UniqueName: \"kubernetes.io/projected/7eaf42b1-80b0-43d3-8820-0b17637d46a6-kube-api-access-5pg6l\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.354501 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.354509 4697 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.354628 4697 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1fc0882-9076-4159-86d9-bf9795206baf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.354638 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eaf42b1-80b0-43d3-8820-0b17637d46a6-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.354647 4697 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dddf9\" (UniqueName: \"kubernetes.io/projected/d1fc0882-9076-4159-86d9-bf9795206baf-kube-api-access-dddf9\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.519963 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.529742 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.541452 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 16:52:50 crc kubenswrapper[4697]: E0220 16:52:50.541944 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfab7d0c-8b89-47a2-a01f-03e8962ccb92" containerName="mariadb-account-create-update" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.541984 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfab7d0c-8b89-47a2-a01f-03e8962ccb92" containerName="mariadb-account-create-update" Feb 20 16:52:50 crc kubenswrapper[4697]: E0220 16:52:50.541997 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eaf42b1-80b0-43d3-8820-0b17637d46a6" containerName="cinder-api-log" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542004 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eaf42b1-80b0-43d3-8820-0b17637d46a6" containerName="cinder-api-log" Feb 20 16:52:50 crc kubenswrapper[4697]: E0220 16:52:50.542014 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a0cc76-ae5d-4ed2-898b-8e9ade19ada3" containerName="mariadb-database-create" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542021 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a0cc76-ae5d-4ed2-898b-8e9ade19ada3" containerName="mariadb-database-create" Feb 20 16:52:50 crc kubenswrapper[4697]: E0220 16:52:50.542032 4697 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5f2fd71a-c008-4059-b649-f881e3809691" containerName="mariadb-database-create" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542038 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2fd71a-c008-4059-b649-f881e3809691" containerName="mariadb-database-create" Feb 20 16:52:50 crc kubenswrapper[4697]: E0220 16:52:50.542051 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97792b78-572a-47ab-9ed8-381adea8950c" containerName="mariadb-database-create" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542056 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="97792b78-572a-47ab-9ed8-381adea8950c" containerName="mariadb-database-create" Feb 20 16:52:50 crc kubenswrapper[4697]: E0220 16:52:50.542080 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eaf42b1-80b0-43d3-8820-0b17637d46a6" containerName="cinder-api" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542086 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eaf42b1-80b0-43d3-8820-0b17637d46a6" containerName="cinder-api" Feb 20 16:52:50 crc kubenswrapper[4697]: E0220 16:52:50.542095 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1fc0882-9076-4159-86d9-bf9795206baf" containerName="mariadb-account-create-update" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542101 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1fc0882-9076-4159-86d9-bf9795206baf" containerName="mariadb-account-create-update" Feb 20 16:52:50 crc kubenswrapper[4697]: E0220 16:52:50.542115 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa21f75-ab3a-4a94-8350-a891760b38b8" containerName="mariadb-account-create-update" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542121 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa21f75-ab3a-4a94-8350-a891760b38b8" containerName="mariadb-account-create-update" Feb 20 16:52:50 crc kubenswrapper[4697]: 
I0220 16:52:50.542311 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa21f75-ab3a-4a94-8350-a891760b38b8" containerName="mariadb-account-create-update" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542330 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2fd71a-c008-4059-b649-f881e3809691" containerName="mariadb-database-create" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542343 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a0cc76-ae5d-4ed2-898b-8e9ade19ada3" containerName="mariadb-database-create" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542358 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1fc0882-9076-4159-86d9-bf9795206baf" containerName="mariadb-account-create-update" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542373 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eaf42b1-80b0-43d3-8820-0b17637d46a6" containerName="cinder-api" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542379 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="97792b78-572a-47ab-9ed8-381adea8950c" containerName="mariadb-database-create" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542389 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eaf42b1-80b0-43d3-8820-0b17637d46a6" containerName="cinder-api-log" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.542397 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfab7d0c-8b89-47a2-a01f-03e8962ccb92" containerName="mariadb-account-create-update" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.543663 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.546198 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.546390 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.549736 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.552393 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.659937 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.660065 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.660099 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8dd164d-3a39-4c58-99a0-1766204765bf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.660123 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.660177 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-scripts\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.660221 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8dd164d-3a39-4c58-99a0-1766204765bf-logs\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.660244 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mz6f\" (UniqueName: \"kubernetes.io/projected/e8dd164d-3a39-4c58-99a0-1766204765bf-kube-api-access-9mz6f\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.660288 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-config-data\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.660317 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-combined-ca-bundle\") pod \"cinder-api-0\" 
(UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.762118 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8dd164d-3a39-4c58-99a0-1766204765bf-logs\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.762188 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mz6f\" (UniqueName: \"kubernetes.io/projected/e8dd164d-3a39-4c58-99a0-1766204765bf-kube-api-access-9mz6f\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.762227 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-config-data\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.762284 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.762314 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.762373 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.762403 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8dd164d-3a39-4c58-99a0-1766204765bf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.762448 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.762516 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-scripts\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.762657 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8dd164d-3a39-4c58-99a0-1766204765bf-logs\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.762717 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8dd164d-3a39-4c58-99a0-1766204765bf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: 
I0220 16:52:50.766824 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-config-data\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.766868 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.767287 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.767938 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-scripts\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.768554 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.769886 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8dd164d-3a39-4c58-99a0-1766204765bf-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.781034 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mz6f\" (UniqueName: \"kubernetes.io/projected/e8dd164d-3a39-4c58-99a0-1766204765bf-kube-api-access-9mz6f\") pod \"cinder-api-0\" (UID: \"e8dd164d-3a39-4c58-99a0-1766204765bf\") " pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.871925 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 16:52:50 crc kubenswrapper[4697]: I0220 16:52:50.888581 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eaf42b1-80b0-43d3-8820-0b17637d46a6" path="/var/lib/kubelet/pods/7eaf42b1-80b0-43d3-8820-0b17637d46a6/volumes" Feb 20 16:52:51 crc kubenswrapper[4697]: I0220 16:52:51.215982 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2eec23bf-874f-423a-8d8c-b3f20b494c87","Type":"ContainerStarted","Data":"b31a9745ea12b3ab5a27df85a61f987925320ea78b9cda588c33e7e53e5982b5"} Feb 20 16:52:51 crc kubenswrapper[4697]: I0220 16:52:51.243021 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.243001403 podStartE2EDuration="4.243001403s" podCreationTimestamp="2026-02-20 16:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:51.234221781 +0000 UTC m=+1279.014267179" watchObservedRunningTime="2026-02-20 16:52:51.243001403 +0000 UTC m=+1279.023046811" Feb 20 16:52:51 crc kubenswrapper[4697]: I0220 16:52:51.299301 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 16:52:51 crc kubenswrapper[4697]: I0220 16:52:51.615003 4697 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 16:52:51 crc kubenswrapper[4697]: I0220 16:52:51.615244 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="db97975b-6aae-4903-830f-13bfdd9a47b3" containerName="glance-log" containerID="cri-o://f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142" gracePeriod=30 Feb 20 16:52:51 crc kubenswrapper[4697]: I0220 16:52:51.615386 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="db97975b-6aae-4903-830f-13bfdd9a47b3" containerName="glance-httpd" containerID="cri-o://1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67" gracePeriod=30 Feb 20 16:52:52 crc kubenswrapper[4697]: I0220 16:52:52.231639 4697 generic.go:334] "Generic (PLEG): container finished" podID="db97975b-6aae-4903-830f-13bfdd9a47b3" containerID="f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142" exitCode=143 Feb 20 16:52:52 crc kubenswrapper[4697]: I0220 16:52:52.232115 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"db97975b-6aae-4903-830f-13bfdd9a47b3","Type":"ContainerDied","Data":"f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142"} Feb 20 16:52:52 crc kubenswrapper[4697]: I0220 16:52:52.236544 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8dd164d-3a39-4c58-99a0-1766204765bf","Type":"ContainerStarted","Data":"79f7375e6a94dd9b3b1f751d88a9e20718e28a0c31cc496a61122c099ed25068"} Feb 20 16:52:52 crc kubenswrapper[4697]: I0220 16:52:52.236580 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8dd164d-3a39-4c58-99a0-1766204765bf","Type":"ContainerStarted","Data":"894bd09d88e58e26f7363f180e9b46f37252fdb5fd543574cb67b74e8f947ec5"} Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 
16:52:53.270288 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8dd164d-3a39-4c58-99a0-1766204765bf","Type":"ContainerStarted","Data":"42fa500e8438730b5b9925e402ff78df79b08a0331640783b8d16c2a575ab389"} Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.270904 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.311558 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.311533538 podStartE2EDuration="3.311533538s" podCreationTimestamp="2026-02-20 16:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:53.287277675 +0000 UTC m=+1281.067323093" watchObservedRunningTime="2026-02-20 16:52:53.311533538 +0000 UTC m=+1281.091578946" Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.566109 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.617925 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57d8cdd7b4-pxpls" Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.681730 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b64457bc4-cnrrj"] Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.681975 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-b64457bc4-cnrrj" podUID="15b0bf59-8d86-4210-a901-2841715a8487" containerName="placement-log" containerID="cri-o://c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279" gracePeriod=30 Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.682383 4697 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/placement-b64457bc4-cnrrj" podUID="15b0bf59-8d86-4210-a901-2841715a8487" containerName="placement-api" containerID="cri-o://accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a" gracePeriod=30 Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.861211 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.970173 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-public-tls-certs\") pod \"db97975b-6aae-4903-830f-13bfdd9a47b3\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.970215 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-scripts\") pod \"db97975b-6aae-4903-830f-13bfdd9a47b3\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.970252 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dl5f\" (UniqueName: \"kubernetes.io/projected/db97975b-6aae-4903-830f-13bfdd9a47b3-kube-api-access-2dl5f\") pod \"db97975b-6aae-4903-830f-13bfdd9a47b3\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.970315 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"db97975b-6aae-4903-830f-13bfdd9a47b3\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.970346 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-config-data\") pod \"db97975b-6aae-4903-830f-13bfdd9a47b3\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.970369 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-httpd-run\") pod \"db97975b-6aae-4903-830f-13bfdd9a47b3\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.970475 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-logs\") pod \"db97975b-6aae-4903-830f-13bfdd9a47b3\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.970509 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-combined-ca-bundle\") pod \"db97975b-6aae-4903-830f-13bfdd9a47b3\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.972695 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "db97975b-6aae-4903-830f-13bfdd9a47b3" (UID: "db97975b-6aae-4903-830f-13bfdd9a47b3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.973311 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-logs" (OuterVolumeSpecName: "logs") pod "db97975b-6aae-4903-830f-13bfdd9a47b3" (UID: "db97975b-6aae-4903-830f-13bfdd9a47b3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.991608 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "db97975b-6aae-4903-830f-13bfdd9a47b3" (UID: "db97975b-6aae-4903-830f-13bfdd9a47b3"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.991766 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db97975b-6aae-4903-830f-13bfdd9a47b3-kube-api-access-2dl5f" (OuterVolumeSpecName: "kube-api-access-2dl5f") pod "db97975b-6aae-4903-830f-13bfdd9a47b3" (UID: "db97975b-6aae-4903-830f-13bfdd9a47b3"). InnerVolumeSpecName "kube-api-access-2dl5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:53 crc kubenswrapper[4697]: I0220 16:52:53.992531 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-scripts" (OuterVolumeSpecName: "scripts") pod "db97975b-6aae-4903-830f-13bfdd9a47b3" (UID: "db97975b-6aae-4903-830f-13bfdd9a47b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.037556 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db97975b-6aae-4903-830f-13bfdd9a47b3" (UID: "db97975b-6aae-4903-830f-13bfdd9a47b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.066798 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "db97975b-6aae-4903-830f-13bfdd9a47b3" (UID: "db97975b-6aae-4903-830f-13bfdd9a47b3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.076912 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-config-data" (OuterVolumeSpecName: "config-data") pod "db97975b-6aae-4903-830f-13bfdd9a47b3" (UID: "db97975b-6aae-4903-830f-13bfdd9a47b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.078816 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-config-data\") pod \"db97975b-6aae-4903-830f-13bfdd9a47b3\" (UID: \"db97975b-6aae-4903-830f-13bfdd9a47b3\") " Feb 20 16:52:54 crc kubenswrapper[4697]: W0220 16:52:54.079121 4697 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/db97975b-6aae-4903-830f-13bfdd9a47b3/volumes/kubernetes.io~secret/config-data Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.079136 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-config-data" (OuterVolumeSpecName: "config-data") pod "db97975b-6aae-4903-830f-13bfdd9a47b3" (UID: "db97975b-6aae-4903-830f-13bfdd9a47b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.079715 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.079737 4697 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.079746 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.079755 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dl5f\" (UniqueName: \"kubernetes.io/projected/db97975b-6aae-4903-830f-13bfdd9a47b3-kube-api-access-2dl5f\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.079777 4697 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.079786 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db97975b-6aae-4903-830f-13bfdd9a47b3-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.079795 4697 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.079806 4697 reconciler_common.go:293] "Volume 
detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db97975b-6aae-4903-830f-13bfdd9a47b3-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.101933 4697 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.181758 4697 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.282492 4697 generic.go:334] "Generic (PLEG): container finished" podID="db97975b-6aae-4903-830f-13bfdd9a47b3" containerID="1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67" exitCode=0 Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.282581 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"db97975b-6aae-4903-830f-13bfdd9a47b3","Type":"ContainerDied","Data":"1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67"} Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.282649 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"db97975b-6aae-4903-830f-13bfdd9a47b3","Type":"ContainerDied","Data":"9731608616d9ba8a33fa6a0d6098ea044a689220112743d8115ef5cc35a3b080"} Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.282671 4697 scope.go:117] "RemoveContainer" containerID="1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.282859 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.285367 4697 generic.go:334] "Generic (PLEG): container finished" podID="15b0bf59-8d86-4210-a901-2841715a8487" containerID="c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279" exitCode=143 Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.285446 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b64457bc4-cnrrj" event={"ID":"15b0bf59-8d86-4210-a901-2841715a8487","Type":"ContainerDied","Data":"c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279"} Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.368759 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.370400 4697 scope.go:117] "RemoveContainer" containerID="f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.398501 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.417529 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 16:52:54 crc kubenswrapper[4697]: E0220 16:52:54.419657 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db97975b-6aae-4903-830f-13bfdd9a47b3" containerName="glance-log" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.419676 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="db97975b-6aae-4903-830f-13bfdd9a47b3" containerName="glance-log" Feb 20 16:52:54 crc kubenswrapper[4697]: E0220 16:52:54.419692 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db97975b-6aae-4903-830f-13bfdd9a47b3" containerName="glance-httpd" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.419698 4697 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="db97975b-6aae-4903-830f-13bfdd9a47b3" containerName="glance-httpd" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.419876 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="db97975b-6aae-4903-830f-13bfdd9a47b3" containerName="glance-httpd" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.419894 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="db97975b-6aae-4903-830f-13bfdd9a47b3" containerName="glance-log" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.421065 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.423846 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.424165 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.432750 4697 scope.go:117] "RemoveContainer" containerID="1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67" Feb 20 16:52:54 crc kubenswrapper[4697]: E0220 16:52:54.434697 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67\": container with ID starting with 1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67 not found: ID does not exist" containerID="1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.434745 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67"} err="failed to get container status 
\"1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67\": rpc error: code = NotFound desc = could not find container \"1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67\": container with ID starting with 1e95110d7591629b75feb77f777c8171703748dcc26dcdd377fbd946916f6b67 not found: ID does not exist" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.434899 4697 scope.go:117] "RemoveContainer" containerID="f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142" Feb 20 16:52:54 crc kubenswrapper[4697]: E0220 16:52:54.435565 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142\": container with ID starting with f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142 not found: ID does not exist" containerID="f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.435616 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142"} err="failed to get container status \"f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142\": rpc error: code = NotFound desc = could not find container \"f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142\": container with ID starting with f75cbc4f01e77a87f321d71e453351cbd4954d5e75012fdefb020ecf969fb142 not found: ID does not exist" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.440908 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.487198 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.487286 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.487308 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-config-data\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.487644 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-logs\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.487689 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7lc6\" (UniqueName: \"kubernetes.io/projected/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-kube-api-access-x7lc6\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.487848 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.487969 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.488003 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-scripts\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.553570 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.598449 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.598492 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-scripts\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.598560 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.598694 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.598775 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-config-data\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc 
kubenswrapper[4697]: I0220 16:52:54.599036 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-logs\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.599087 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7lc6\" (UniqueName: \"kubernetes.io/projected/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-kube-api-access-x7lc6\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.599195 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.599633 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.610139 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.610570 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-logs\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.626326 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.627281 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-config-data\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.627727 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-scripts\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.639912 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.663419 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7lc6\" (UniqueName: 
\"kubernetes.io/projected/b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0-kube-api-access-x7lc6\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.686857 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0\") " pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.704074 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-scripts\") pod \"15b0bf59-8d86-4210-a901-2841715a8487\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.704135 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9k8t\" (UniqueName: \"kubernetes.io/projected/15b0bf59-8d86-4210-a901-2841715a8487-kube-api-access-v9k8t\") pod \"15b0bf59-8d86-4210-a901-2841715a8487\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.704221 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-public-tls-certs\") pod \"15b0bf59-8d86-4210-a901-2841715a8487\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.704298 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-internal-tls-certs\") pod \"15b0bf59-8d86-4210-a901-2841715a8487\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " 
Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.704328 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-config-data\") pod \"15b0bf59-8d86-4210-a901-2841715a8487\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.704399 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15b0bf59-8d86-4210-a901-2841715a8487-logs\") pod \"15b0bf59-8d86-4210-a901-2841715a8487\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.704470 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-combined-ca-bundle\") pod \"15b0bf59-8d86-4210-a901-2841715a8487\" (UID: \"15b0bf59-8d86-4210-a901-2841715a8487\") " Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.711854 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15b0bf59-8d86-4210-a901-2841715a8487-logs" (OuterVolumeSpecName: "logs") pod "15b0bf59-8d86-4210-a901-2841715a8487" (UID: "15b0bf59-8d86-4210-a901-2841715a8487"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.724655 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-scripts" (OuterVolumeSpecName: "scripts") pod "15b0bf59-8d86-4210-a901-2841715a8487" (UID: "15b0bf59-8d86-4210-a901-2841715a8487"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.734626 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b0bf59-8d86-4210-a901-2841715a8487-kube-api-access-v9k8t" (OuterVolumeSpecName: "kube-api-access-v9k8t") pod "15b0bf59-8d86-4210-a901-2841715a8487" (UID: "15b0bf59-8d86-4210-a901-2841715a8487"). InnerVolumeSpecName "kube-api-access-v9k8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.769865 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.806124 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15b0bf59-8d86-4210-a901-2841715a8487-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.806156 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.806164 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9k8t\" (UniqueName: \"kubernetes.io/projected/15b0bf59-8d86-4210-a901-2841715a8487-kube-api-access-v9k8t\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.866023 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15b0bf59-8d86-4210-a901-2841715a8487" (UID: "15b0bf59-8d86-4210-a901-2841715a8487"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.870677 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-config-data" (OuterVolumeSpecName: "config-data") pod "15b0bf59-8d86-4210-a901-2841715a8487" (UID: "15b0bf59-8d86-4210-a901-2841715a8487"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.878125 4697 scope.go:117] "RemoveContainer" containerID="fa06df65039534b15c98e7d14bb3f265f51319d10a2d9eff5ef2e1bbd5bd7513" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.904928 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db97975b-6aae-4903-830f-13bfdd9a47b3" path="/var/lib/kubelet/pods/db97975b-6aae-4903-830f-13bfdd9a47b3/volumes" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.907733 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.907761 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.911524 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "15b0bf59-8d86-4210-a901-2841715a8487" (UID: "15b0bf59-8d86-4210-a901-2841715a8487"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:54 crc kubenswrapper[4697]: I0220 16:52:54.962526 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "15b0bf59-8d86-4210-a901-2841715a8487" (UID: "15b0bf59-8d86-4210-a901-2841715a8487"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.025249 4697 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.025287 4697 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b0bf59-8d86-4210-a901-2841715a8487-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.299714 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e","Type":"ContainerStarted","Data":"74e9e5606eff6d86cb52fc6d80fcb9b553d333adb4de8982749d84679873d906"} Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.303593 4697 generic.go:334] "Generic (PLEG): container finished" podID="15b0bf59-8d86-4210-a901-2841715a8487" containerID="accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a" exitCode=0 Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.303629 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b64457bc4-cnrrj" event={"ID":"15b0bf59-8d86-4210-a901-2841715a8487","Type":"ContainerDied","Data":"accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a"} Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.303652 4697 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b64457bc4-cnrrj" event={"ID":"15b0bf59-8d86-4210-a901-2841715a8487","Type":"ContainerDied","Data":"595f7c98c54449c156d93e9f96d3e3378eca083413f2a8082daea62837242365"} Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.303670 4697 scope.go:117] "RemoveContainer" containerID="accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a" Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.303678 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b64457bc4-cnrrj" Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.340894 4697 scope.go:117] "RemoveContainer" containerID="c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279" Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.349519 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b64457bc4-cnrrj"] Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.357291 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b64457bc4-cnrrj"] Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.400837 4697 scope.go:117] "RemoveContainer" containerID="accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a" Feb 20 16:52:55 crc kubenswrapper[4697]: E0220 16:52:55.402818 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a\": container with ID starting with accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a not found: ID does not exist" containerID="accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a" Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.402861 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a"} err="failed to 
get container status \"accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a\": rpc error: code = NotFound desc = could not find container \"accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a\": container with ID starting with accdf589d9f70bc54ace1d693daf95e8e107451fd21f789bc16c691aa0cbb42a not found: ID does not exist" Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.402888 4697 scope.go:117] "RemoveContainer" containerID="c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279" Feb 20 16:52:55 crc kubenswrapper[4697]: E0220 16:52:55.403300 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279\": container with ID starting with c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279 not found: ID does not exist" containerID="c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279" Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.403363 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279"} err="failed to get container status \"c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279\": rpc error: code = NotFound desc = could not find container \"c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279\": container with ID starting with c678545563a7f17ce1a2d45571e5b62c6120ca51c1e8fd0e7d87ca86e7356279 not found: ID does not exist" Feb 20 16:52:55 crc kubenswrapper[4697]: I0220 16:52:55.457328 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.144086 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4l2dn"] Feb 20 16:52:56 crc kubenswrapper[4697]: E0220 16:52:56.144984 4697 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b0bf59-8d86-4210-a901-2841715a8487" containerName="placement-log" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.145001 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b0bf59-8d86-4210-a901-2841715a8487" containerName="placement-log" Feb 20 16:52:56 crc kubenswrapper[4697]: E0220 16:52:56.145016 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b0bf59-8d86-4210-a901-2841715a8487" containerName="placement-api" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.145022 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b0bf59-8d86-4210-a901-2841715a8487" containerName="placement-api" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.145208 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b0bf59-8d86-4210-a901-2841715a8487" containerName="placement-log" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.145237 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b0bf59-8d86-4210-a901-2841715a8487" containerName="placement-api" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.145861 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.154449 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2rbhw" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.154712 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.156074 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.181293 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4l2dn"] Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.256908 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-scripts\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.256968 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4mx\" (UniqueName: \"kubernetes.io/projected/25b9be1a-ce8d-40bb-978c-f2213d6175d2-kube-api-access-qb4mx\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.257009 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-config-data\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " 
pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.257038 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.327574 4697 generic.go:334] "Generic (PLEG): container finished" podID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerID="cc4d4ee346506e8d7c5d7bd173bdccf98fc8714174809e7252e1bd87d0d0d276" exitCode=0 Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.327664 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cd1e114-c1b8-4c57-a890-abd5708ce27f","Type":"ContainerDied","Data":"cc4d4ee346506e8d7c5d7bd173bdccf98fc8714174809e7252e1bd87d0d0d276"} Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.327716 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cd1e114-c1b8-4c57-a890-abd5708ce27f","Type":"ContainerDied","Data":"54e9e8228aa4e264ca35670fafa68747c0a540deabba26ba0c910975c69b6c36"} Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.327727 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54e9e8228aa4e264ca35670fafa68747c0a540deabba26ba0c910975c69b6c36" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.335103 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0","Type":"ContainerStarted","Data":"6e879e8c7f025b0c34d130ee28f1f3191559c7e0e620e4165d3fe4542def6006"} Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.335141 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0","Type":"ContainerStarted","Data":"892ee39e62b8f9efe3c0091bd9568755d49178b48a588e68048c7a77c26ed9cd"} Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.358478 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4mx\" (UniqueName: \"kubernetes.io/projected/25b9be1a-ce8d-40bb-978c-f2213d6175d2-kube-api-access-qb4mx\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.358548 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-config-data\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.358579 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.358688 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-scripts\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.362658 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-scripts\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.363344 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.371108 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-config-data\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.375948 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4mx\" (UniqueName: \"kubernetes.io/projected/25b9be1a-ce8d-40bb-978c-f2213d6175d2-kube-api-access-qb4mx\") pod \"nova-cell0-conductor-db-sync-4l2dn\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.390619 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.465389 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-config-data\") pod \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.465478 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-log-httpd\") pod \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.465507 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-sg-core-conf-yaml\") pod \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.465576 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-combined-ca-bundle\") pod \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.465634 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6k7b\" (UniqueName: \"kubernetes.io/projected/3cd1e114-c1b8-4c57-a890-abd5708ce27f-kube-api-access-v6k7b\") pod \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.465686 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-run-httpd\") pod \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.465728 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-scripts\") pod \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\" (UID: \"3cd1e114-c1b8-4c57-a890-abd5708ce27f\") " Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.467169 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3cd1e114-c1b8-4c57-a890-abd5708ce27f" (UID: "3cd1e114-c1b8-4c57-a890-abd5708ce27f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.467574 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3cd1e114-c1b8-4c57-a890-abd5708ce27f" (UID: "3cd1e114-c1b8-4c57-a890-abd5708ce27f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.469375 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-scripts" (OuterVolumeSpecName: "scripts") pod "3cd1e114-c1b8-4c57-a890-abd5708ce27f" (UID: "3cd1e114-c1b8-4c57-a890-abd5708ce27f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.470102 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd1e114-c1b8-4c57-a890-abd5708ce27f-kube-api-access-v6k7b" (OuterVolumeSpecName: "kube-api-access-v6k7b") pod "3cd1e114-c1b8-4c57-a890-abd5708ce27f" (UID: "3cd1e114-c1b8-4c57-a890-abd5708ce27f"). InnerVolumeSpecName "kube-api-access-v6k7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.472350 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.500949 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3cd1e114-c1b8-4c57-a890-abd5708ce27f" (UID: "3cd1e114-c1b8-4c57-a890-abd5708ce27f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.563043 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cd1e114-c1b8-4c57-a890-abd5708ce27f" (UID: "3cd1e114-c1b8-4c57-a890-abd5708ce27f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.569966 4697 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.570005 4697 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.570016 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.570025 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6k7b\" (UniqueName: \"kubernetes.io/projected/3cd1e114-c1b8-4c57-a890-abd5708ce27f-kube-api-access-v6k7b\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.570033 4697 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cd1e114-c1b8-4c57-a890-abd5708ce27f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.570041 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.591814 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-config-data" (OuterVolumeSpecName: "config-data") pod "3cd1e114-c1b8-4c57-a890-abd5708ce27f" (UID: "3cd1e114-c1b8-4c57-a890-abd5708ce27f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.678872 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd1e114-c1b8-4c57-a890-abd5708ce27f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:52:56 crc kubenswrapper[4697]: I0220 16:52:56.897847 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15b0bf59-8d86-4210-a901-2841715a8487" path="/var/lib/kubelet/pods/15b0bf59-8d86-4210-a901-2841715a8487/volumes" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.032688 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4l2dn"] Feb 20 16:52:57 crc kubenswrapper[4697]: W0220 16:52:57.045064 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25b9be1a_ce8d_40bb_978c_f2213d6175d2.slice/crio-bc406e40535a037b143f206a14129dbada592e999aa4a7017a503e65d855eaaf WatchSource:0}: Error finding container bc406e40535a037b143f206a14129dbada592e999aa4a7017a503e65d855eaaf: Status 404 returned error can't find the container with id bc406e40535a037b143f206a14129dbada592e999aa4a7017a503e65d855eaaf Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.354048 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4l2dn" event={"ID":"25b9be1a-ce8d-40bb-978c-f2213d6175d2","Type":"ContainerStarted","Data":"bc406e40535a037b143f206a14129dbada592e999aa4a7017a503e65d855eaaf"} Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.360133 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.361540 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0","Type":"ContainerStarted","Data":"7cb25bfda4042156e8eb1da2fd19ef0db9e3be8f8663c5d68ca2771bc6ca5c90"} Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.410683 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.410665311 podStartE2EDuration="3.410665311s" podCreationTimestamp="2026-02-20 16:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:52:57.404589388 +0000 UTC m=+1285.184634796" watchObservedRunningTime="2026-02-20 16:52:57.410665311 +0000 UTC m=+1285.190710719" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.435781 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.446111 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.459419 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:57 crc kubenswrapper[4697]: E0220 16:52:57.459875 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="proxy-httpd" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.459892 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="proxy-httpd" Feb 20 16:52:57 crc kubenswrapper[4697]: E0220 16:52:57.459907 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="ceilometer-central-agent" Feb 20 
16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.459915 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="ceilometer-central-agent" Feb 20 16:52:57 crc kubenswrapper[4697]: E0220 16:52:57.459931 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="sg-core" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.459938 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="sg-core" Feb 20 16:52:57 crc kubenswrapper[4697]: E0220 16:52:57.459956 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="ceilometer-notification-agent" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.459962 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="ceilometer-notification-agent" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.460133 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="ceilometer-central-agent" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.460148 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="ceilometer-notification-agent" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.460158 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="proxy-httpd" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.460179 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" containerName="sg-core" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.461808 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.466283 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.466790 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.469173 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.600279 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-config-data\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.600328 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-run-httpd\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.600368 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-log-httpd\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.600447 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " 
pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.600533 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-scripts\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.600550 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.600575 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92vz9\" (UniqueName: \"kubernetes.io/projected/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-kube-api-access-92vz9\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.702996 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92vz9\" (UniqueName: \"kubernetes.io/projected/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-kube-api-access-92vz9\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.703075 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-config-data\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.703130 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-run-httpd\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.703165 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-log-httpd\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.703285 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.703798 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-run-httpd\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.704123 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-log-httpd\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.705049 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-scripts\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc 
kubenswrapper[4697]: I0220 16:52:57.705105 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.721950 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-config-data\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.724971 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.726033 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.732276 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-scripts\") pod \"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.738006 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92vz9\" (UniqueName: \"kubernetes.io/projected/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-kube-api-access-92vz9\") pod 
\"ceilometer-0\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " pod="openstack/ceilometer-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.767298 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.767714 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.800121 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.808357 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:57 crc kubenswrapper[4697]: I0220 16:52:57.808797 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:52:58 crc kubenswrapper[4697]: I0220 16:52:58.333685 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:52:58 crc kubenswrapper[4697]: W0220 16:52:58.339835 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e1c0579_6d7c_40f9_93b6_f4dfef660bd5.slice/crio-35db779a6315140dd7af6ca0062d063da3b05b114112c09a36f81a483080817a WatchSource:0}: Error finding container 35db779a6315140dd7af6ca0062d063da3b05b114112c09a36f81a483080817a: Status 404 returned error can't find the container with id 35db779a6315140dd7af6ca0062d063da3b05b114112c09a36f81a483080817a Feb 20 16:52:58 crc kubenswrapper[4697]: I0220 16:52:58.372885 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5","Type":"ContainerStarted","Data":"35db779a6315140dd7af6ca0062d063da3b05b114112c09a36f81a483080817a"} Feb 20 
16:52:58 crc kubenswrapper[4697]: I0220 16:52:58.373187 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:58 crc kubenswrapper[4697]: I0220 16:52:58.373215 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 16:52:58 crc kubenswrapper[4697]: I0220 16:52:58.889401 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd1e114-c1b8-4c57-a890-abd5708ce27f" path="/var/lib/kubelet/pods/3cd1e114-c1b8-4c57-a890-abd5708ce27f/volumes" Feb 20 16:52:59 crc kubenswrapper[4697]: I0220 16:52:59.385216 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5","Type":"ContainerStarted","Data":"d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d"} Feb 20 16:52:59 crc kubenswrapper[4697]: I0220 16:52:59.385526 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5","Type":"ContainerStarted","Data":"bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1"} Feb 20 16:52:59 crc kubenswrapper[4697]: I0220 16:52:59.721869 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 20 16:52:59 crc kubenswrapper[4697]: I0220 16:52:59.721924 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 20 16:52:59 crc kubenswrapper[4697]: I0220 16:52:59.752455 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 20 16:53:00 crc kubenswrapper[4697]: I0220 16:53:00.391537 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 16:53:00 crc kubenswrapper[4697]: I0220 16:53:00.402195 4697 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 16:53:00 crc kubenswrapper[4697]: I0220 16:53:00.403004 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5","Type":"ContainerStarted","Data":"da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4"} Feb 20 16:53:00 crc kubenswrapper[4697]: I0220 16:53:00.444830 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 20 16:53:00 crc kubenswrapper[4697]: I0220 16:53:00.487634 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 20 16:53:00 crc kubenswrapper[4697]: I0220 16:53:00.717320 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 16:53:01 crc kubenswrapper[4697]: I0220 16:53:01.051217 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:53:01 crc kubenswrapper[4697]: I0220 16:53:01.184901 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:53:01 crc kubenswrapper[4697]: I0220 16:53:01.184960 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.423901 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" 
podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" containerID="cri-o://74e9e5606eff6d86cb52fc6d80fcb9b553d333adb4de8982749d84679873d906" gracePeriod=30 Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.425338 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5","Type":"ContainerStarted","Data":"677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7"} Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.425877 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.425539 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="sg-core" containerID="cri-o://da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4" gracePeriod=30 Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.425567 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="ceilometer-notification-agent" containerID="cri-o://d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d" gracePeriod=30 Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.425521 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="proxy-httpd" containerID="cri-o://677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7" gracePeriod=30 Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.425505 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="ceilometer-central-agent" 
containerID="cri-o://bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1" gracePeriod=30 Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.840062 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.510589891 podStartE2EDuration="5.840042738s" podCreationTimestamp="2026-02-20 16:52:57 +0000 UTC" firstStartedPulling="2026-02-20 16:52:58.342872578 +0000 UTC m=+1286.122917986" lastFinishedPulling="2026-02-20 16:53:01.672325425 +0000 UTC m=+1289.452370833" observedRunningTime="2026-02-20 16:53:02.454170659 +0000 UTC m=+1290.234216067" watchObservedRunningTime="2026-02-20 16:53:02.840042738 +0000 UTC m=+1290.620088146" Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.846064 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.847735 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="1e5e751a-8226-4445-87ff-2347c603df7c" containerName="watcher-applier" containerID="cri-o://f8b52430b475ec36694d4d83616a15f714bcdd3a471de49f37d537b0228a5aa1" gracePeriod=30 Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.865857 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.866182 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api" containerID="cri-o://6326de61069a59dad5f3e03f2dd41b7639ae730654c051a3708c8ceb947aada2" gracePeriod=30 Feb 20 16:53:02 crc kubenswrapper[4697]: I0220 16:53:02.866102 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api-log" 
containerID="cri-o://7ba3aace87cb8a1a606f5248c0e0cf42eb249fb39aa36e0ff142c0657d42b5e4" gracePeriod=30 Feb 20 16:53:03 crc kubenswrapper[4697]: I0220 16:53:03.067179 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 20 16:53:03 crc kubenswrapper[4697]: I0220 16:53:03.453415 4697 generic.go:334] "Generic (PLEG): container finished" podID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerID="677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7" exitCode=0 Feb 20 16:53:03 crc kubenswrapper[4697]: I0220 16:53:03.453464 4697 generic.go:334] "Generic (PLEG): container finished" podID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerID="da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4" exitCode=2 Feb 20 16:53:03 crc kubenswrapper[4697]: I0220 16:53:03.453471 4697 generic.go:334] "Generic (PLEG): container finished" podID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerID="d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d" exitCode=0 Feb 20 16:53:03 crc kubenswrapper[4697]: I0220 16:53:03.453512 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5","Type":"ContainerDied","Data":"677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7"} Feb 20 16:53:03 crc kubenswrapper[4697]: I0220 16:53:03.453539 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5","Type":"ContainerDied","Data":"da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4"} Feb 20 16:53:03 crc kubenswrapper[4697]: I0220 16:53:03.453547 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5","Type":"ContainerDied","Data":"d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d"} Feb 20 16:53:03 crc kubenswrapper[4697]: I0220 16:53:03.476069 4697 
generic.go:334] "Generic (PLEG): container finished" podID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerID="7ba3aace87cb8a1a606f5248c0e0cf42eb249fb39aa36e0ff142c0657d42b5e4" exitCode=143 Feb 20 16:53:03 crc kubenswrapper[4697]: I0220 16:53:03.476111 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f0d9f5b2-5617-4e75-97f2-f00513a5be67","Type":"ContainerDied","Data":"7ba3aace87cb8a1a606f5248c0e0cf42eb249fb39aa36e0ff142c0657d42b5e4"} Feb 20 16:53:04 crc kubenswrapper[4697]: I0220 16:53:04.052601 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.185:9322/\": dial tcp 10.217.0.185:9322: connect: connection refused" Feb 20 16:53:04 crc kubenswrapper[4697]: I0220 16:53:04.052867 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.0.185:9322/\": dial tcp 10.217.0.185:9322: connect: connection refused" Feb 20 16:53:04 crc kubenswrapper[4697]: I0220 16:53:04.489118 4697 generic.go:334] "Generic (PLEG): container finished" podID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerID="74e9e5606eff6d86cb52fc6d80fcb9b553d333adb4de8982749d84679873d906" exitCode=0 Feb 20 16:53:04 crc kubenswrapper[4697]: I0220 16:53:04.489198 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e","Type":"ContainerDied","Data":"74e9e5606eff6d86cb52fc6d80fcb9b553d333adb4de8982749d84679873d906"} Feb 20 16:53:04 crc kubenswrapper[4697]: I0220 16:53:04.489557 4697 scope.go:117] "RemoveContainer" containerID="fa06df65039534b15c98e7d14bb3f265f51319d10a2d9eff5ef2e1bbd5bd7513" Feb 20 16:53:04 crc kubenswrapper[4697]: I0220 16:53:04.492941 4697 
generic.go:334] "Generic (PLEG): container finished" podID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerID="6326de61069a59dad5f3e03f2dd41b7639ae730654c051a3708c8ceb947aada2" exitCode=0 Feb 20 16:53:04 crc kubenswrapper[4697]: I0220 16:53:04.492971 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f0d9f5b2-5617-4e75-97f2-f00513a5be67","Type":"ContainerDied","Data":"6326de61069a59dad5f3e03f2dd41b7639ae730654c051a3708c8ceb947aada2"} Feb 20 16:53:04 crc kubenswrapper[4697]: E0220 16:53:04.594280 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f8b52430b475ec36694d4d83616a15f714bcdd3a471de49f37d537b0228a5aa1" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 20 16:53:04 crc kubenswrapper[4697]: E0220 16:53:04.595719 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f8b52430b475ec36694d4d83616a15f714bcdd3a471de49f37d537b0228a5aa1" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 20 16:53:04 crc kubenswrapper[4697]: E0220 16:53:04.602327 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f8b52430b475ec36694d4d83616a15f714bcdd3a471de49f37d537b0228a5aa1" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 20 16:53:04 crc kubenswrapper[4697]: E0220 16:53:04.602403 4697 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" 
podUID="1e5e751a-8226-4445-87ff-2347c603df7c" containerName="watcher-applier" Feb 20 16:53:04 crc kubenswrapper[4697]: I0220 16:53:04.773134 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 16:53:04 crc kubenswrapper[4697]: I0220 16:53:04.773174 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 16:53:04 crc kubenswrapper[4697]: I0220 16:53:04.810753 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 16:53:04 crc kubenswrapper[4697]: I0220 16:53:04.820503 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 16:53:05 crc kubenswrapper[4697]: I0220 16:53:05.505168 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 16:53:05 crc kubenswrapper[4697]: I0220 16:53:05.505486 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 16:53:07 crc kubenswrapper[4697]: I0220 16:53:07.423375 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 16:53:07 crc kubenswrapper[4697]: I0220 16:53:07.424200 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 16:53:09 crc kubenswrapper[4697]: I0220 16:53:09.052699 4697 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.0.185:9322/\": dial tcp 10.217.0.185:9322: connect: connection refused" Feb 20 16:53:09 crc kubenswrapper[4697]: I0220 16:53:09.052725 4697 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/watcher-api-0" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.185:9322/\": dial tcp 10.217.0.185:9322: connect: connection refused" Feb 20 16:53:09 crc kubenswrapper[4697]: E0220 16:53:09.594666 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f8b52430b475ec36694d4d83616a15f714bcdd3a471de49f37d537b0228a5aa1" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 20 16:53:09 crc kubenswrapper[4697]: E0220 16:53:09.596368 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f8b52430b475ec36694d4d83616a15f714bcdd3a471de49f37d537b0228a5aa1" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 20 16:53:09 crc kubenswrapper[4697]: E0220 16:53:09.597895 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f8b52430b475ec36694d4d83616a15f714bcdd3a471de49f37d537b0228a5aa1" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 20 16:53:09 crc kubenswrapper[4697]: E0220 16:53:09.598053 4697 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="1e5e751a-8226-4445-87ff-2347c603df7c" containerName="watcher-applier" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.549875 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4l2dn" 
event={"ID":"25b9be1a-ce8d-40bb-978c-f2213d6175d2","Type":"ContainerStarted","Data":"a7e9786095ab3c9d0b955842128535d2ce274ccfb0d497ab87627ef4fad4a57c"} Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.568486 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-4l2dn" podStartSLOduration=1.378065117 podStartE2EDuration="14.568468161s" podCreationTimestamp="2026-02-20 16:52:56 +0000 UTC" firstStartedPulling="2026-02-20 16:52:57.046612724 +0000 UTC m=+1284.826658132" lastFinishedPulling="2026-02-20 16:53:10.237015778 +0000 UTC m=+1298.017061176" observedRunningTime="2026-02-20 16:53:10.566307616 +0000 UTC m=+1298.346353024" watchObservedRunningTime="2026-02-20 16:53:10.568468161 +0000 UTC m=+1298.348513569" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.573012 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.573229 4697 generic.go:334] "Generic (PLEG): container finished" podID="1e5e751a-8226-4445-87ff-2347c603df7c" containerID="f8b52430b475ec36694d4d83616a15f714bcdd3a471de49f37d537b0228a5aa1" exitCode=0 Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.573289 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"1e5e751a-8226-4445-87ff-2347c603df7c","Type":"ContainerDied","Data":"f8b52430b475ec36694d4d83616a15f714bcdd3a471de49f37d537b0228a5aa1"} Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.652316 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-combined-ca-bundle\") pod \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.652713 4697 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-config-data\") pod \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.652748 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-internal-tls-certs\") pod \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.652787 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d9f5b2-5617-4e75-97f2-f00513a5be67-logs\") pod \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.652827 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-public-tls-certs\") pod \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.652916 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg8nk\" (UniqueName: \"kubernetes.io/projected/f0d9f5b2-5617-4e75-97f2-f00513a5be67-kube-api-access-dg8nk\") pod \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\" (UID: \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.653003 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-custom-prometheus-ca\") pod \"f0d9f5b2-5617-4e75-97f2-f00513a5be67\" (UID: 
\"f0d9f5b2-5617-4e75-97f2-f00513a5be67\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.668750 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d9f5b2-5617-4e75-97f2-f00513a5be67-logs" (OuterVolumeSpecName: "logs") pod "f0d9f5b2-5617-4e75-97f2-f00513a5be67" (UID: "f0d9f5b2-5617-4e75-97f2-f00513a5be67"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.687248 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d9f5b2-5617-4e75-97f2-f00513a5be67-kube-api-access-dg8nk" (OuterVolumeSpecName: "kube-api-access-dg8nk") pod "f0d9f5b2-5617-4e75-97f2-f00513a5be67" (UID: "f0d9f5b2-5617-4e75-97f2-f00513a5be67"). InnerVolumeSpecName "kube-api-access-dg8nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.723352 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "f0d9f5b2-5617-4e75-97f2-f00513a5be67" (UID: "f0d9f5b2-5617-4e75-97f2-f00513a5be67"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.733172 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.739459 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.743837 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0d9f5b2-5617-4e75-97f2-f00513a5be67" (UID: "f0d9f5b2-5617-4e75-97f2-f00513a5be67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.755826 4697 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.755850 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.755859 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d9f5b2-5617-4e75-97f2-f00513a5be67-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.755870 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg8nk\" (UniqueName: \"kubernetes.io/projected/f0d9f5b2-5617-4e75-97f2-f00513a5be67-kube-api-access-dg8nk\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.764614 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f0d9f5b2-5617-4e75-97f2-f00513a5be67" (UID: "f0d9f5b2-5617-4e75-97f2-f00513a5be67"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.776197 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-config-data" (OuterVolumeSpecName: "config-data") pod "f0d9f5b2-5617-4e75-97f2-f00513a5be67" (UID: "f0d9f5b2-5617-4e75-97f2-f00513a5be67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.786363 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f0d9f5b2-5617-4e75-97f2-f00513a5be67" (UID: "f0d9f5b2-5617-4e75-97f2-f00513a5be67"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.857955 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-custom-prometheus-ca\") pod \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.858057 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-config-data\") pod \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.858085 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzc5k\" (UniqueName: \"kubernetes.io/projected/1e5e751a-8226-4445-87ff-2347c603df7c-kube-api-access-xzc5k\") pod \"1e5e751a-8226-4445-87ff-2347c603df7c\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " 
Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.858500 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llg97\" (UniqueName: \"kubernetes.io/projected/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-kube-api-access-llg97\") pod \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.858552 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-logs\") pod \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.858587 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-config-data\") pod \"1e5e751a-8226-4445-87ff-2347c603df7c\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.858687 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-combined-ca-bundle\") pod \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\" (UID: \"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.858729 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5e751a-8226-4445-87ff-2347c603df7c-logs\") pod \"1e5e751a-8226-4445-87ff-2347c603df7c\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.858754 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-combined-ca-bundle\") pod \"1e5e751a-8226-4445-87ff-2347c603df7c\" (UID: \"1e5e751a-8226-4445-87ff-2347c603df7c\") " Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.859358 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.859382 4697 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.859394 4697 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d9f5b2-5617-4e75-97f2-f00513a5be67-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.859802 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-logs" (OuterVolumeSpecName: "logs") pod "c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" (UID: "c9c1a3da-b8c3-4825-ba1e-c6bbecff953e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.860010 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e5e751a-8226-4445-87ff-2347c603df7c-logs" (OuterVolumeSpecName: "logs") pod "1e5e751a-8226-4445-87ff-2347c603df7c" (UID: "1e5e751a-8226-4445-87ff-2347c603df7c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.862391 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-kube-api-access-llg97" (OuterVolumeSpecName: "kube-api-access-llg97") pod "c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" (UID: "c9c1a3da-b8c3-4825-ba1e-c6bbecff953e"). InnerVolumeSpecName "kube-api-access-llg97". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.863586 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e5e751a-8226-4445-87ff-2347c603df7c-kube-api-access-xzc5k" (OuterVolumeSpecName: "kube-api-access-xzc5k") pod "1e5e751a-8226-4445-87ff-2347c603df7c" (UID: "1e5e751a-8226-4445-87ff-2347c603df7c"). InnerVolumeSpecName "kube-api-access-xzc5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.887045 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e5e751a-8226-4445-87ff-2347c603df7c" (UID: "1e5e751a-8226-4445-87ff-2347c603df7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.887817 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" (UID: "c9c1a3da-b8c3-4825-ba1e-c6bbecff953e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.910924 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" (UID: "c9c1a3da-b8c3-4825-ba1e-c6bbecff953e"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.918264 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-config-data" (OuterVolumeSpecName: "config-data") pod "c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" (UID: "c9c1a3da-b8c3-4825-ba1e-c6bbecff953e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.938900 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-config-data" (OuterVolumeSpecName: "config-data") pod "1e5e751a-8226-4445-87ff-2347c603df7c" (UID: "1e5e751a-8226-4445-87ff-2347c603df7c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.962510 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.962540 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.962551 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5e751a-8226-4445-87ff-2347c603df7c-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.962560 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5e751a-8226-4445-87ff-2347c603df7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.962570 4697 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.962578 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.962587 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzc5k\" (UniqueName: \"kubernetes.io/projected/1e5e751a-8226-4445-87ff-2347c603df7c-kube-api-access-xzc5k\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.962595 4697 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llg97\" (UniqueName: \"kubernetes.io/projected/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-kube-api-access-llg97\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:10 crc kubenswrapper[4697]: I0220 16:53:10.962602 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.585633 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"f0d9f5b2-5617-4e75-97f2-f00513a5be67","Type":"ContainerDied","Data":"1c4ae7101858cf283d4c95e48d311778ba207a3963e602fcf30f60474ed88985"} Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.585747 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.585978 4697 scope.go:117] "RemoveContainer" containerID="6326de61069a59dad5f3e03f2dd41b7639ae730654c051a3708c8ceb947aada2" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.587547 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"1e5e751a-8226-4445-87ff-2347c603df7c","Type":"ContainerDied","Data":"5c51ca9b955c64a77b57a92ac8def5c6aca40c7b590478af1d754b6b68b45944"} Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.587576 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.589888 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.589897 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c9c1a3da-b8c3-4825-ba1e-c6bbecff953e","Type":"ContainerDied","Data":"4cefd8bf27bd23573d851816f6013edc60787f587573f377a33e6c9ef3e379a3"} Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.615589 4697 scope.go:117] "RemoveContainer" containerID="7ba3aace87cb8a1a606f5248c0e0cf42eb249fb39aa36e0ff142c0657d42b5e4" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.628720 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.658626 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.661201 4697 scope.go:117] "RemoveContainer" containerID="f8b52430b475ec36694d4d83616a15f714bcdd3a471de49f37d537b0228a5aa1" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.711808 4697 scope.go:117] "RemoveContainer" containerID="74e9e5606eff6d86cb52fc6d80fcb9b553d333adb4de8982749d84679873d906" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.713277 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: E0220 16:53:11.714196 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.714238 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: E0220 16:53:11.714264 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api" Feb 20 16:53:11 crc 
kubenswrapper[4697]: I0220 16:53:11.714270 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api" Feb 20 16:53:11 crc kubenswrapper[4697]: E0220 16:53:11.714292 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.714297 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: E0220 16:53:11.714312 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.714319 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: E0220 16:53:11.714336 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api-log" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.714343 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api-log" Feb 20 16:53:11 crc kubenswrapper[4697]: E0220 16:53:11.714359 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5e751a-8226-4445-87ff-2347c603df7c" containerName="watcher-applier" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.714366 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5e751a-8226-4445-87ff-2347c603df7c" containerName="watcher-applier" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.715487 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api" Feb 20 
16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.715515 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.715530 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.715546 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.715557 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e5e751a-8226-4445-87ff-2347c603df7c" containerName="watcher-applier" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.715572 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" containerName="watcher-api-log" Feb 20 16:53:11 crc kubenswrapper[4697]: E0220 16:53:11.715865 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.715878 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.716227 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" containerName="watcher-decision-engine" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.717082 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.721361 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-4pjll" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.721605 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.725230 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.728130 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.747668 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.758052 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.766998 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.775129 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.782894 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.790282 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.791842 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.793937 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.798621 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.799574 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.802623 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.806630 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.817013 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.877958 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f4378d-bb26-408d-9613-82246765639b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878036 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-config-data\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878091 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/21f4378d-bb26-408d-9613-82246765639b-logs\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878119 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csxsj\" (UniqueName: \"kubernetes.io/projected/21f4378d-bb26-408d-9613-82246765639b-kube-api-access-csxsj\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878173 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878209 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b3fb67-823a-4da5-a42d-27745717ba8b-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878290 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f4378d-bb26-408d-9613-82246765639b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878313 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzb66\" 
(UniqueName: \"kubernetes.io/projected/04b1d7c5-4d8e-4962-b396-855adb2605d7-kube-api-access-vzb66\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878408 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878481 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-public-tls-certs\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878597 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878742 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14b3fb67-823a-4da5-a42d-27745717ba8b-logs\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878920 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd296\" (UniqueName: 
\"kubernetes.io/projected/14b3fb67-823a-4da5-a42d-27745717ba8b-kube-api-access-qd296\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.878961 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/21f4378d-bb26-408d-9613-82246765639b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.879011 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b3fb67-823a-4da5-a42d-27745717ba8b-config-data\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.879038 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b1d7c5-4d8e-4962-b396-855adb2605d7-logs\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980392 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21f4378d-bb26-408d-9613-82246765639b-logs\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980458 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csxsj\" (UniqueName: \"kubernetes.io/projected/21f4378d-bb26-408d-9613-82246765639b-kube-api-access-csxsj\") pod 
\"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980489 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980517 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b3fb67-823a-4da5-a42d-27745717ba8b-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980553 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f4378d-bb26-408d-9613-82246765639b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980583 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzb66\" (UniqueName: \"kubernetes.io/projected/04b1d7c5-4d8e-4962-b396-855adb2605d7-kube-api-access-vzb66\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980628 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc 
kubenswrapper[4697]: I0220 16:53:11.980667 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-public-tls-certs\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980731 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980774 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14b3fb67-823a-4da5-a42d-27745717ba8b-logs\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980841 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd296\" (UniqueName: \"kubernetes.io/projected/14b3fb67-823a-4da5-a42d-27745717ba8b-kube-api-access-qd296\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980865 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/21f4378d-bb26-408d-9613-82246765639b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980897 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/14b3fb67-823a-4da5-a42d-27745717ba8b-config-data\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980940 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b1d7c5-4d8e-4962-b396-855adb2605d7-logs\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.980988 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f4378d-bb26-408d-9613-82246765639b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.981019 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-config-data\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.981966 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b1d7c5-4d8e-4962-b396-855adb2605d7-logs\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.983055 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14b3fb67-823a-4da5-a42d-27745717ba8b-logs\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 
16:53:11.983954 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21f4378d-bb26-408d-9613-82246765639b-logs\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.985175 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b3fb67-823a-4da5-a42d-27745717ba8b-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.985353 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f4378d-bb26-408d-9613-82246765639b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.985416 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.985901 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f4378d-bb26-408d-9613-82246765639b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.986267 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.986641 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b3fb67-823a-4da5-a42d-27745717ba8b-config-data\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.989826 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/21f4378d-bb26-408d-9613-82246765639b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:11 crc kubenswrapper[4697]: I0220 16:53:11.998816 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-public-tls-certs\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:11.999779 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd296\" (UniqueName: \"kubernetes.io/projected/14b3fb67-823a-4da5-a42d-27745717ba8b-kube-api-access-qd296\") pod \"watcher-applier-0\" (UID: \"14b3fb67-823a-4da5-a42d-27745717ba8b\") " pod="openstack/watcher-applier-0" Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.000595 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " 
pod="openstack/watcher-api-0" Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.001816 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b1d7c5-4d8e-4962-b396-855adb2605d7-config-data\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.007578 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csxsj\" (UniqueName: \"kubernetes.io/projected/21f4378d-bb26-408d-9613-82246765639b-kube-api-access-csxsj\") pod \"watcher-decision-engine-0\" (UID: \"21f4378d-bb26-408d-9613-82246765639b\") " pod="openstack/watcher-decision-engine-0" Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.015250 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzb66\" (UniqueName: \"kubernetes.io/projected/04b1d7c5-4d8e-4962-b396-855adb2605d7-kube-api-access-vzb66\") pod \"watcher-api-0\" (UID: \"04b1d7c5-4d8e-4962-b396-855adb2605d7\") " pod="openstack/watcher-api-0" Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.050983 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.117312 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.130883 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.558746 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.606444 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"04b1d7c5-4d8e-4962-b396-855adb2605d7","Type":"ContainerStarted","Data":"86ff741b805eccbde8e981e456e05ca66696292ab95befd16725c85b365c3b19"} Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.750987 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.776441 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.893297 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e5e751a-8226-4445-87ff-2347c603df7c" path="/var/lib/kubelet/pods/1e5e751a-8226-4445-87ff-2347c603df7c/volumes" Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.908506 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c1a3da-b8c3-4825-ba1e-c6bbecff953e" path="/var/lib/kubelet/pods/c9c1a3da-b8c3-4825-ba1e-c6bbecff953e/volumes" Feb 20 16:53:12 crc kubenswrapper[4697]: I0220 16:53:12.909891 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d9f5b2-5617-4e75-97f2-f00513a5be67" path="/var/lib/kubelet/pods/f0d9f5b2-5617-4e75-97f2-f00513a5be67/volumes" Feb 20 16:53:13 crc kubenswrapper[4697]: I0220 16:53:13.617312 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"14b3fb67-823a-4da5-a42d-27745717ba8b","Type":"ContainerStarted","Data":"60596fdf772289cfc1ec697a522ebe3462d268b8f7bda45179aea2d9750b4c32"} Feb 20 16:53:13 crc kubenswrapper[4697]: I0220 16:53:13.617359 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/watcher-applier-0" event={"ID":"14b3fb67-823a-4da5-a42d-27745717ba8b","Type":"ContainerStarted","Data":"a3fd461dacb39c3d408af427824a423bbdd3e66191469f546c47350485b5e356"} Feb 20 16:53:13 crc kubenswrapper[4697]: I0220 16:53:13.619886 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"04b1d7c5-4d8e-4962-b396-855adb2605d7","Type":"ContainerStarted","Data":"5640107ca182f263f5bf72639b25b224a922cfea118b74834b8790119b605768"} Feb 20 16:53:13 crc kubenswrapper[4697]: I0220 16:53:13.619923 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"04b1d7c5-4d8e-4962-b396-855adb2605d7","Type":"ContainerStarted","Data":"39fa84b706f9ae56c4f659c495441d7e44444d0ab108dcab89fc0067df18a598"} Feb 20 16:53:13 crc kubenswrapper[4697]: I0220 16:53:13.620089 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 20 16:53:13 crc kubenswrapper[4697]: I0220 16:53:13.622084 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"21f4378d-bb26-408d-9613-82246765639b","Type":"ContainerStarted","Data":"2614bc724e674bdc1b5f592f297773e34693ebbf36bd2a5fc7e501afd3725252"} Feb 20 16:53:13 crc kubenswrapper[4697]: I0220 16:53:13.622125 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"21f4378d-bb26-408d-9613-82246765639b","Type":"ContainerStarted","Data":"1a0e2bc6c80a68809c78f0f482cfa01a29c1f19ab73237ea3d34e8212fb09fbc"} Feb 20 16:53:13 crc kubenswrapper[4697]: I0220 16:53:13.643982 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.643961634 podStartE2EDuration="2.643961634s" podCreationTimestamp="2026-02-20 16:53:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 
16:53:13.641372499 +0000 UTC m=+1301.421417907" watchObservedRunningTime="2026-02-20 16:53:13.643961634 +0000 UTC m=+1301.424007042" Feb 20 16:53:13 crc kubenswrapper[4697]: I0220 16:53:13.673069 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.67304848 podStartE2EDuration="2.67304848s" podCreationTimestamp="2026-02-20 16:53:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:53:13.668917446 +0000 UTC m=+1301.448962854" watchObservedRunningTime="2026-02-20 16:53:13.67304848 +0000 UTC m=+1301.453093888" Feb 20 16:53:13 crc kubenswrapper[4697]: I0220 16:53:13.692397 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.6923769589999997 podStartE2EDuration="2.692376959s" podCreationTimestamp="2026-02-20 16:53:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:53:13.686665854 +0000 UTC m=+1301.466711262" watchObservedRunningTime="2026-02-20 16:53:13.692376959 +0000 UTC m=+1301.472422377" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.199216 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.244980 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-sg-core-conf-yaml\") pod \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.245091 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-combined-ca-bundle\") pod \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.245152 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-run-httpd\") pod \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.245177 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92vz9\" (UniqueName: \"kubernetes.io/projected/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-kube-api-access-92vz9\") pod \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.245250 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-log-httpd\") pod \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.245275 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-config-data\") pod \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.245295 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-scripts\") pod \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\" (UID: \"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5\") " Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.245854 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" (UID: "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.246047 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" (UID: "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.246456 4697 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.246477 4697 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.251962 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-kube-api-access-92vz9" (OuterVolumeSpecName: "kube-api-access-92vz9") pod "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" (UID: "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5"). InnerVolumeSpecName "kube-api-access-92vz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.288154 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-scripts" (OuterVolumeSpecName: "scripts") pod "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" (UID: "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.298694 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" (UID: "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.350317 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92vz9\" (UniqueName: \"kubernetes.io/projected/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-kube-api-access-92vz9\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.350349 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.350358 4697 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.358591 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-config-data" (OuterVolumeSpecName: "config-data") pod "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" (UID: "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.380997 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" (UID: "7e1c0579-6d7c-40f9-93b6-f4dfef660bd5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.452391 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.452714 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.639708 4697 generic.go:334] "Generic (PLEG): container finished" podID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerID="bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1" exitCode=0 Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.639757 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5","Type":"ContainerDied","Data":"bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1"} Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.639788 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e1c0579-6d7c-40f9-93b6-f4dfef660bd5","Type":"ContainerDied","Data":"35db779a6315140dd7af6ca0062d063da3b05b114112c09a36f81a483080817a"} Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.639808 4697 scope.go:117] "RemoveContainer" containerID="677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.639947 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.661769 4697 scope.go:117] "RemoveContainer" containerID="da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.683547 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.684769 4697 scope.go:117] "RemoveContainer" containerID="d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.706041 4697 scope.go:117] "RemoveContainer" containerID="bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.708086 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.737798 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:53:15 crc kubenswrapper[4697]: E0220 16:53:15.738199 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="sg-core" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.738217 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="sg-core" Feb 20 16:53:15 crc kubenswrapper[4697]: E0220 16:53:15.738228 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="proxy-httpd" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.738233 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="proxy-httpd" Feb 20 16:53:15 crc kubenswrapper[4697]: E0220 16:53:15.738255 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" 
containerName="ceilometer-central-agent" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.738262 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="ceilometer-central-agent" Feb 20 16:53:15 crc kubenswrapper[4697]: E0220 16:53:15.738290 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="ceilometer-notification-agent" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.738296 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="ceilometer-notification-agent" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.738609 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="ceilometer-notification-agent" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.738626 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="ceilometer-central-agent" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.738636 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="sg-core" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.738646 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" containerName="proxy-httpd" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.743460 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.746165 4697 scope.go:117] "RemoveContainer" containerID="677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.750269 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 16:53:15 crc kubenswrapper[4697]: E0220 16:53:15.764784 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7\": container with ID starting with 677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7 not found: ID does not exist" containerID="677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.771776 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7"} err="failed to get container status \"677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7\": rpc error: code = NotFound desc = could not find container \"677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7\": container with ID starting with 677b3feed61d877f73a763f1eff09c3106c151007adb2ba33518aafdbe4e0fc7 not found: ID does not exist" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.772057 4697 scope.go:117] "RemoveContainer" containerID="da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.765642 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 16:53:15 crc kubenswrapper[4697]: E0220 16:53:15.772884 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4\": container with ID starting with da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4 not found: ID does not exist" containerID="da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.772933 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4"} err="failed to get container status \"da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4\": rpc error: code = NotFound desc = could not find container \"da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4\": container with ID starting with da4c6b862f97a86e5ab46f5bb8be52d8b9d5705042398d1341023d122a4a5de4 not found: ID does not exist" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.772969 4697 scope.go:117] "RemoveContainer" containerID="d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d" Feb 20 16:53:15 crc kubenswrapper[4697]: E0220 16:53:15.775799 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d\": container with ID starting with d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d not found: ID does not exist" containerID="d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.775832 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d"} err="failed to get container status \"d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d\": rpc error: code = NotFound desc = could not find container \"d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d\": container with ID 
starting with d20529af0f5dac348fdaa6439a422fbbc39c8c8f39ca6a8c95314b66aecdf82d not found: ID does not exist" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.775863 4697 scope.go:117] "RemoveContainer" containerID="bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1" Feb 20 16:53:15 crc kubenswrapper[4697]: E0220 16:53:15.777790 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1\": container with ID starting with bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1 not found: ID does not exist" containerID="bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.781492 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1"} err="failed to get container status \"bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1\": rpc error: code = NotFound desc = could not find container \"bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1\": container with ID starting with bdedd17ffa6c9d5924e67f49ae9c5a4cc64612ca179fdda1b79fcc74abcd54c1 not found: ID does not exist" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.787215 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.860491 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-log-httpd\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.860565 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.860671 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-run-httpd\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.860800 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-scripts\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.860847 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.860967 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4bx6\" (UniqueName: \"kubernetes.io/projected/936dfb05-cca9-40aa-8106-9399dc9a287b-kube-api-access-r4bx6\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.861026 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-config-data\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.962557 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-scripts\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.962618 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.962695 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4bx6\" (UniqueName: \"kubernetes.io/projected/936dfb05-cca9-40aa-8106-9399dc9a287b-kube-api-access-r4bx6\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.962729 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-config-data\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.962807 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-log-httpd\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 
16:53:15.962868 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.962888 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-run-httpd\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.963605 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-log-httpd\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.963844 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-run-httpd\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.967979 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.968056 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-scripts\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " 
pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.968564 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-config-data\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.971565 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:15 crc kubenswrapper[4697]: I0220 16:53:15.989863 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4bx6\" (UniqueName: \"kubernetes.io/projected/936dfb05-cca9-40aa-8106-9399dc9a287b-kube-api-access-r4bx6\") pod \"ceilometer-0\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " pod="openstack/ceilometer-0" Feb 20 16:53:16 crc kubenswrapper[4697]: I0220 16:53:16.120471 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:53:16 crc kubenswrapper[4697]: I0220 16:53:16.247888 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 20 16:53:16 crc kubenswrapper[4697]: I0220 16:53:16.595103 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:53:16 crc kubenswrapper[4697]: I0220 16:53:16.598199 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 16:53:16 crc kubenswrapper[4697]: I0220 16:53:16.652091 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"936dfb05-cca9-40aa-8106-9399dc9a287b","Type":"ContainerStarted","Data":"45d706022b1eb756c3889a10cb98afa7dda2f21225c4cb2b010ed3b4eaa86e4a"} Feb 20 16:53:16 crc kubenswrapper[4697]: I0220 16:53:16.726508 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:53:16 crc kubenswrapper[4697]: I0220 16:53:16.892758 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e1c0579-6d7c-40f9-93b6-f4dfef660bd5" path="/var/lib/kubelet/pods/7e1c0579-6d7c-40f9-93b6-f4dfef660bd5/volumes" Feb 20 16:53:17 crc kubenswrapper[4697]: I0220 16:53:17.051720 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 20 16:53:17 crc kubenswrapper[4697]: I0220 16:53:17.131196 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Feb 20 16:53:17 crc kubenswrapper[4697]: I0220 16:53:17.662338 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"936dfb05-cca9-40aa-8106-9399dc9a287b","Type":"ContainerStarted","Data":"0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22"} Feb 20 16:53:17 crc kubenswrapper[4697]: I0220 16:53:17.662379 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"936dfb05-cca9-40aa-8106-9399dc9a287b","Type":"ContainerStarted","Data":"c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f"} Feb 20 16:53:18 crc kubenswrapper[4697]: I0220 16:53:18.673743 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"936dfb05-cca9-40aa-8106-9399dc9a287b","Type":"ContainerStarted","Data":"c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57"} Feb 20 16:53:20 crc kubenswrapper[4697]: I0220 16:53:20.692805 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="ceilometer-central-agent" containerID="cri-o://c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f" gracePeriod=30 Feb 20 16:53:20 crc kubenswrapper[4697]: I0220 16:53:20.693159 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="proxy-httpd" containerID="cri-o://d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5" gracePeriod=30 Feb 20 16:53:20 crc kubenswrapper[4697]: I0220 16:53:20.692830 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"936dfb05-cca9-40aa-8106-9399dc9a287b","Type":"ContainerStarted","Data":"d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5"} Feb 20 16:53:20 crc kubenswrapper[4697]: I0220 16:53:20.693211 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="ceilometer-notification-agent" containerID="cri-o://0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22" gracePeriod=30 Feb 20 16:53:20 crc kubenswrapper[4697]: I0220 16:53:20.693249 4697 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="sg-core" containerID="cri-o://c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57" gracePeriod=30 Feb 20 16:53:20 crc kubenswrapper[4697]: I0220 16:53:20.693262 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 16:53:20 crc kubenswrapper[4697]: I0220 16:53:20.727788 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.454011416 podStartE2EDuration="5.727772464s" podCreationTimestamp="2026-02-20 16:53:15 +0000 UTC" firstStartedPulling="2026-02-20 16:53:16.597969475 +0000 UTC m=+1304.378014883" lastFinishedPulling="2026-02-20 16:53:19.871730523 +0000 UTC m=+1307.651775931" observedRunningTime="2026-02-20 16:53:20.726274706 +0000 UTC m=+1308.506320114" watchObservedRunningTime="2026-02-20 16:53:20.727772464 +0000 UTC m=+1308.507817862" Feb 20 16:53:21 crc kubenswrapper[4697]: I0220 16:53:21.703711 4697 generic.go:334] "Generic (PLEG): container finished" podID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerID="d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5" exitCode=0 Feb 20 16:53:21 crc kubenswrapper[4697]: I0220 16:53:21.703962 4697 generic.go:334] "Generic (PLEG): container finished" podID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerID="c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57" exitCode=2 Feb 20 16:53:21 crc kubenswrapper[4697]: I0220 16:53:21.703971 4697 generic.go:334] "Generic (PLEG): container finished" podID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerID="0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22" exitCode=0 Feb 20 16:53:21 crc kubenswrapper[4697]: I0220 16:53:21.703791 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"936dfb05-cca9-40aa-8106-9399dc9a287b","Type":"ContainerDied","Data":"d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5"} Feb 20 16:53:21 crc kubenswrapper[4697]: I0220 16:53:21.704003 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"936dfb05-cca9-40aa-8106-9399dc9a287b","Type":"ContainerDied","Data":"c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57"} Feb 20 16:53:21 crc kubenswrapper[4697]: I0220 16:53:21.704019 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"936dfb05-cca9-40aa-8106-9399dc9a287b","Type":"ContainerDied","Data":"0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22"} Feb 20 16:53:22 crc kubenswrapper[4697]: I0220 16:53:22.052108 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Feb 20 16:53:22 crc kubenswrapper[4697]: I0220 16:53:22.060123 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Feb 20 16:53:22 crc kubenswrapper[4697]: I0220 16:53:22.118727 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 20 16:53:22 crc kubenswrapper[4697]: I0220 16:53:22.131471 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Feb 20 16:53:22 crc kubenswrapper[4697]: I0220 16:53:22.149756 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 20 16:53:22 crc kubenswrapper[4697]: I0220 16:53:22.158087 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Feb 20 16:53:22 crc kubenswrapper[4697]: I0220 16:53:22.712387 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 20 16:53:22 crc kubenswrapper[4697]: 
I0220 16:53:22.719350 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 20 16:53:22 crc kubenswrapper[4697]: I0220 16:53:22.797363 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Feb 20 16:53:22 crc kubenswrapper[4697]: I0220 16:53:22.819887 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 20 16:53:24 crc kubenswrapper[4697]: I0220 16:53:24.732383 4697 generic.go:334] "Generic (PLEG): container finished" podID="25b9be1a-ce8d-40bb-978c-f2213d6175d2" containerID="a7e9786095ab3c9d0b955842128535d2ce274ccfb0d497ab87627ef4fad4a57c" exitCode=0 Feb 20 16:53:24 crc kubenswrapper[4697]: I0220 16:53:24.732519 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4l2dn" event={"ID":"25b9be1a-ce8d-40bb-978c-f2213d6175d2","Type":"ContainerDied","Data":"a7e9786095ab3c9d0b955842128535d2ce274ccfb0d497ab87627ef4fad4a57c"} Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.389728 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.465841 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-scripts\") pod \"936dfb05-cca9-40aa-8106-9399dc9a287b\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.465957 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4bx6\" (UniqueName: \"kubernetes.io/projected/936dfb05-cca9-40aa-8106-9399dc9a287b-kube-api-access-r4bx6\") pod \"936dfb05-cca9-40aa-8106-9399dc9a287b\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.466019 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-log-httpd\") pod \"936dfb05-cca9-40aa-8106-9399dc9a287b\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.466044 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-run-httpd\") pod \"936dfb05-cca9-40aa-8106-9399dc9a287b\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.466106 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-sg-core-conf-yaml\") pod \"936dfb05-cca9-40aa-8106-9399dc9a287b\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.466191 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-combined-ca-bundle\") pod \"936dfb05-cca9-40aa-8106-9399dc9a287b\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.466395 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "936dfb05-cca9-40aa-8106-9399dc9a287b" (UID: "936dfb05-cca9-40aa-8106-9399dc9a287b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.466646 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "936dfb05-cca9-40aa-8106-9399dc9a287b" (UID: "936dfb05-cca9-40aa-8106-9399dc9a287b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.466921 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-config-data\") pod \"936dfb05-cca9-40aa-8106-9399dc9a287b\" (UID: \"936dfb05-cca9-40aa-8106-9399dc9a287b\") " Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.467360 4697 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.467374 4697 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/936dfb05-cca9-40aa-8106-9399dc9a287b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.473572 4697 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-scripts" (OuterVolumeSpecName: "scripts") pod "936dfb05-cca9-40aa-8106-9399dc9a287b" (UID: "936dfb05-cca9-40aa-8106-9399dc9a287b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.473616 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/936dfb05-cca9-40aa-8106-9399dc9a287b-kube-api-access-r4bx6" (OuterVolumeSpecName: "kube-api-access-r4bx6") pod "936dfb05-cca9-40aa-8106-9399dc9a287b" (UID: "936dfb05-cca9-40aa-8106-9399dc9a287b"). InnerVolumeSpecName "kube-api-access-r4bx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.513132 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "936dfb05-cca9-40aa-8106-9399dc9a287b" (UID: "936dfb05-cca9-40aa-8106-9399dc9a287b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.561752 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "936dfb05-cca9-40aa-8106-9399dc9a287b" (UID: "936dfb05-cca9-40aa-8106-9399dc9a287b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.569761 4697 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.569792 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.569802 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.569812 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4bx6\" (UniqueName: \"kubernetes.io/projected/936dfb05-cca9-40aa-8106-9399dc9a287b-kube-api-access-r4bx6\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.578868 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-config-data" (OuterVolumeSpecName: "config-data") pod "936dfb05-cca9-40aa-8106-9399dc9a287b" (UID: "936dfb05-cca9-40aa-8106-9399dc9a287b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.672831 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/936dfb05-cca9-40aa-8106-9399dc9a287b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.744768 4697 generic.go:334] "Generic (PLEG): container finished" podID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerID="c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f" exitCode=0 Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.744815 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.744854 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"936dfb05-cca9-40aa-8106-9399dc9a287b","Type":"ContainerDied","Data":"c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f"} Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.744901 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"936dfb05-cca9-40aa-8106-9399dc9a287b","Type":"ContainerDied","Data":"45d706022b1eb756c3889a10cb98afa7dda2f21225c4cb2b010ed3b4eaa86e4a"} Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.744927 4697 scope.go:117] "RemoveContainer" containerID="d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.771872 4697 scope.go:117] "RemoveContainer" containerID="c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.806037 4697 scope.go:117] "RemoveContainer" containerID="0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.808287 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.822116 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.840644 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:53:25 crc kubenswrapper[4697]: E0220 16:53:25.841412 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="ceilometer-notification-agent" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.841431 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="ceilometer-notification-agent" Feb 20 16:53:25 crc kubenswrapper[4697]: E0220 16:53:25.841465 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="ceilometer-central-agent" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.841472 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="ceilometer-central-agent" Feb 20 16:53:25 crc kubenswrapper[4697]: E0220 16:53:25.841503 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="proxy-httpd" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.841511 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="proxy-httpd" Feb 20 16:53:25 crc kubenswrapper[4697]: E0220 16:53:25.841534 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="sg-core" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.841540 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="sg-core" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.841903 
4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="sg-core" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.841932 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="proxy-httpd" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.841953 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="ceilometer-central-agent" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.841970 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" containerName="ceilometer-notification-agent" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.843578 4697 scope.go:117] "RemoveContainer" containerID="c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.844954 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.847772 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.849124 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.887797 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-scripts\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.887853 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-log-httpd\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.887902 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-run-httpd\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.888013 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5pdl\" (UniqueName: \"kubernetes.io/projected/fecc4170-b5a7-4f50-b0c5-99408fae97a8-kube-api-access-n5pdl\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.888175 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.888201 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.888250 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-config-data\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.895121 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.915308 4697 scope.go:117] "RemoveContainer" containerID="d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5" Feb 20 16:53:25 crc kubenswrapper[4697]: E0220 16:53:25.916222 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5\": container with ID starting with d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5 not found: ID does not exist" containerID="d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.916261 4697 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5"} err="failed to get container status \"d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5\": rpc error: code = NotFound desc = could not find container \"d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5\": container with ID starting with d100a8f5a638869398e295b74013901c482927b4549418cdfa2d1659671baef5 not found: ID does not exist" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.916287 4697 scope.go:117] "RemoveContainer" containerID="c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57" Feb 20 16:53:25 crc kubenswrapper[4697]: E0220 16:53:25.916670 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57\": container with ID starting with c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57 not found: ID does not exist" containerID="c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.916693 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57"} err="failed to get container status \"c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57\": rpc error: code = NotFound desc = could not find container \"c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57\": container with ID starting with c7eabb04ff01350deac1bb0e366e7103fc0a487c3b7bb1672fe221dca2368f57 not found: ID does not exist" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.916706 4697 scope.go:117] "RemoveContainer" containerID="0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22" Feb 20 16:53:25 crc kubenswrapper[4697]: E0220 16:53:25.917084 4697 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22\": container with ID starting with 0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22 not found: ID does not exist" containerID="0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.917108 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22"} err="failed to get container status \"0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22\": rpc error: code = NotFound desc = could not find container \"0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22\": container with ID starting with 0f30855f1c1fc9299428f1e318c69fecb48b0ee3a26325e5edfcb5335f417a22 not found: ID does not exist" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.917126 4697 scope.go:117] "RemoveContainer" containerID="c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f" Feb 20 16:53:25 crc kubenswrapper[4697]: E0220 16:53:25.917670 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f\": container with ID starting with c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f not found: ID does not exist" containerID="c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.917688 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f"} err="failed to get container status \"c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f\": rpc error: code = NotFound desc = could not find container 
\"c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f\": container with ID starting with c0c9470e77651fbbde01e891c6d6d39213909fb9011dd896e50dab072120fb6f not found: ID does not exist" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.989608 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.989805 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.989923 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-config-data\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.990090 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-scripts\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.990188 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-log-httpd\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: 
I0220 16:53:25.990282 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-run-httpd\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.990353 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5pdl\" (UniqueName: \"kubernetes.io/projected/fecc4170-b5a7-4f50-b0c5-99408fae97a8-kube-api-access-n5pdl\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.990894 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-run-httpd\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:25 crc kubenswrapper[4697]: I0220 16:53:25.992147 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-log-httpd\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.000563 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-scripts\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.001020 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-config-data\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " 
pod="openstack/ceilometer-0" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.001055 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.007943 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.013169 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5pdl\" (UniqueName: \"kubernetes.io/projected/fecc4170-b5a7-4f50-b0c5-99408fae97a8-kube-api-access-n5pdl\") pod \"ceilometer-0\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") " pod="openstack/ceilometer-0" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.154192 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.189664 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.195995 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-config-data\") pod \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.196162 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-scripts\") pod \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.196301 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb4mx\" (UniqueName: \"kubernetes.io/projected/25b9be1a-ce8d-40bb-978c-f2213d6175d2-kube-api-access-qb4mx\") pod \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.196358 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-combined-ca-bundle\") pod \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\" (UID: \"25b9be1a-ce8d-40bb-978c-f2213d6175d2\") " Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.204601 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-scripts" (OuterVolumeSpecName: "scripts") pod "25b9be1a-ce8d-40bb-978c-f2213d6175d2" (UID: "25b9be1a-ce8d-40bb-978c-f2213d6175d2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.208481 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b9be1a-ce8d-40bb-978c-f2213d6175d2-kube-api-access-qb4mx" (OuterVolumeSpecName: "kube-api-access-qb4mx") pod "25b9be1a-ce8d-40bb-978c-f2213d6175d2" (UID: "25b9be1a-ce8d-40bb-978c-f2213d6175d2"). InnerVolumeSpecName "kube-api-access-qb4mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.234569 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25b9be1a-ce8d-40bb-978c-f2213d6175d2" (UID: "25b9be1a-ce8d-40bb-978c-f2213d6175d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.239964 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-config-data" (OuterVolumeSpecName: "config-data") pod "25b9be1a-ce8d-40bb-978c-f2213d6175d2" (UID: "25b9be1a-ce8d-40bb-978c-f2213d6175d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.299536 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.299570 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.299583 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb4mx\" (UniqueName: \"kubernetes.io/projected/25b9be1a-ce8d-40bb-978c-f2213d6175d2-kube-api-access-qb4mx\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.299615 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b9be1a-ce8d-40bb-978c-f2213d6175d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.657456 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:53:26 crc kubenswrapper[4697]: W0220 16:53:26.666036 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfecc4170_b5a7_4f50_b0c5_99408fae97a8.slice/crio-08002d44bcc7aaa48089e460d56ac48d089639b7b2ebaffe531fbf5c69c841f7 WatchSource:0}: Error finding container 08002d44bcc7aaa48089e460d56ac48d089639b7b2ebaffe531fbf5c69c841f7: Status 404 returned error can't find the container with id 08002d44bcc7aaa48089e460d56ac48d089639b7b2ebaffe531fbf5c69c841f7 Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.763617 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4l2dn" 
event={"ID":"25b9be1a-ce8d-40bb-978c-f2213d6175d2","Type":"ContainerDied","Data":"bc406e40535a037b143f206a14129dbada592e999aa4a7017a503e65d855eaaf"} Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.763668 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc406e40535a037b143f206a14129dbada592e999aa4a7017a503e65d855eaaf" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.763634 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4l2dn" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.766503 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fecc4170-b5a7-4f50-b0c5-99408fae97a8","Type":"ContainerStarted","Data":"08002d44bcc7aaa48089e460d56ac48d089639b7b2ebaffe531fbf5c69c841f7"} Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.891253 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="936dfb05-cca9-40aa-8106-9399dc9a287b" path="/var/lib/kubelet/pods/936dfb05-cca9-40aa-8106-9399dc9a287b/volumes" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.892058 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 16:53:26 crc kubenswrapper[4697]: E0220 16:53:26.892421 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b9be1a-ce8d-40bb-978c-f2213d6175d2" containerName="nova-cell0-conductor-db-sync" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.892438 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b9be1a-ce8d-40bb-978c-f2213d6175d2" containerName="nova-cell0-conductor-db-sync" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.892670 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b9be1a-ce8d-40bb-978c-f2213d6175d2" containerName="nova-cell0-conductor-db-sync" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.893433 4697 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.893680 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.896076 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2rbhw" Feb 20 16:53:26 crc kubenswrapper[4697]: I0220 16:53:26.896459 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.129916 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a29103e-4075-486e-8107-34b4a75352cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2a29103e-4075-486e-8107-34b4a75352cc\") " pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.130704 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk5td\" (UniqueName: \"kubernetes.io/projected/2a29103e-4075-486e-8107-34b4a75352cc-kube-api-access-xk5td\") pod \"nova-cell0-conductor-0\" (UID: \"2a29103e-4075-486e-8107-34b4a75352cc\") " pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.130862 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a29103e-4075-486e-8107-34b4a75352cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2a29103e-4075-486e-8107-34b4a75352cc\") " pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.232586 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2a29103e-4075-486e-8107-34b4a75352cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2a29103e-4075-486e-8107-34b4a75352cc\") " pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.232749 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk5td\" (UniqueName: \"kubernetes.io/projected/2a29103e-4075-486e-8107-34b4a75352cc-kube-api-access-xk5td\") pod \"nova-cell0-conductor-0\" (UID: \"2a29103e-4075-486e-8107-34b4a75352cc\") " pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.232802 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a29103e-4075-486e-8107-34b4a75352cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2a29103e-4075-486e-8107-34b4a75352cc\") " pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.240297 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a29103e-4075-486e-8107-34b4a75352cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2a29103e-4075-486e-8107-34b4a75352cc\") " pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.246468 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a29103e-4075-486e-8107-34b4a75352cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2a29103e-4075-486e-8107-34b4a75352cc\") " pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.251550 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk5td\" (UniqueName: \"kubernetes.io/projected/2a29103e-4075-486e-8107-34b4a75352cc-kube-api-access-xk5td\") pod \"nova-cell0-conductor-0\" (UID: 
\"2a29103e-4075-486e-8107-34b4a75352cc\") " pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.532103 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.776514 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fecc4170-b5a7-4f50-b0c5-99408fae97a8","Type":"ContainerStarted","Data":"717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931"} Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.776878 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fecc4170-b5a7-4f50-b0c5-99408fae97a8","Type":"ContainerStarted","Data":"f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182"} Feb 20 16:53:27 crc kubenswrapper[4697]: I0220 16:53:27.969380 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 16:53:28 crc kubenswrapper[4697]: I0220 16:53:28.787044 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2a29103e-4075-486e-8107-34b4a75352cc","Type":"ContainerStarted","Data":"ac79a8c7a8adcec757225d270d8faee563447627722a566dfc31847b4c050eb4"} Feb 20 16:53:28 crc kubenswrapper[4697]: I0220 16:53:28.787564 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2a29103e-4075-486e-8107-34b4a75352cc","Type":"ContainerStarted","Data":"6016d41983a5fd4f147affbd38b313a399c4a1009d6fb1bb80e5ed0b130ed59b"} Feb 20 16:53:28 crc kubenswrapper[4697]: I0220 16:53:28.787759 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:28 crc kubenswrapper[4697]: I0220 16:53:28.789924 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fecc4170-b5a7-4f50-b0c5-99408fae97a8","Type":"ContainerStarted","Data":"eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346"} Feb 20 16:53:28 crc kubenswrapper[4697]: I0220 16:53:28.811823 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.8117901400000003 podStartE2EDuration="2.81179014s" podCreationTimestamp="2026-02-20 16:53:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:53:28.801473769 +0000 UTC m=+1316.581519177" watchObservedRunningTime="2026-02-20 16:53:28.81179014 +0000 UTC m=+1316.591835548" Feb 20 16:53:29 crc kubenswrapper[4697]: I0220 16:53:29.803547 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fecc4170-b5a7-4f50-b0c5-99408fae97a8","Type":"ContainerStarted","Data":"2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463"} Feb 20 16:53:29 crc kubenswrapper[4697]: I0220 16:53:29.804161 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 16:53:29 crc kubenswrapper[4697]: I0220 16:53:29.832087 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9581191580000001 podStartE2EDuration="4.832069174s" podCreationTimestamp="2026-02-20 16:53:25 +0000 UTC" firstStartedPulling="2026-02-20 16:53:26.668865952 +0000 UTC m=+1314.448911380" lastFinishedPulling="2026-02-20 16:53:29.542815978 +0000 UTC m=+1317.322861396" observedRunningTime="2026-02-20 16:53:29.826407751 +0000 UTC m=+1317.606453159" watchObservedRunningTime="2026-02-20 16:53:29.832069174 +0000 UTC m=+1317.612114582" Feb 20 16:53:31 crc kubenswrapper[4697]: I0220 16:53:31.185267 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:53:31 crc kubenswrapper[4697]: I0220 16:53:31.185658 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:53:31 crc kubenswrapper[4697]: I0220 16:53:31.185711 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:53:31 crc kubenswrapper[4697]: I0220 16:53:31.186583 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18b63bf23bfaf1519d7cd30ced77d1ca85c9b60f2bab25b83e9d358614cbd28d"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 16:53:31 crc kubenswrapper[4697]: I0220 16:53:31.186657 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://18b63bf23bfaf1519d7cd30ced77d1ca85c9b60f2bab25b83e9d358614cbd28d" gracePeriod=600 Feb 20 16:53:31 crc kubenswrapper[4697]: I0220 16:53:31.825775 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="18b63bf23bfaf1519d7cd30ced77d1ca85c9b60f2bab25b83e9d358614cbd28d" exitCode=0 Feb 20 16:53:31 crc kubenswrapper[4697]: I0220 16:53:31.825876 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"18b63bf23bfaf1519d7cd30ced77d1ca85c9b60f2bab25b83e9d358614cbd28d"} Feb 20 16:53:31 crc kubenswrapper[4697]: I0220 16:53:31.826266 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382"} Feb 20 16:53:31 crc kubenswrapper[4697]: I0220 16:53:31.826300 4697 scope.go:117] "RemoveContainer" containerID="7914ab01639074e62ef59025fd1a7ee59be0ddca912a97f841ccc347f5312e8e" Feb 20 16:53:37 crc kubenswrapper[4697]: I0220 16:53:37.564803 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.023924 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-nmwj6"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.025938 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.027912 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.028911 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.040361 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-nmwj6"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.177829 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.179073 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.180198 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vppr2\" (UniqueName: \"kubernetes.io/projected/7eebb045-e1b7-4b9d-9d41-9485997d1743-kube-api-access-vppr2\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.180508 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.180977 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-config-data\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.181095 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-scripts\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.181651 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.192639 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.284985 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-config-data\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.285263 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-scripts\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.285394 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.285555 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqr74\" (UniqueName: \"kubernetes.io/projected/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-kube-api-access-wqr74\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.285647 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vppr2\" (UniqueName: \"kubernetes.io/projected/7eebb045-e1b7-4b9d-9d41-9485997d1743-kube-api-access-vppr2\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.285716 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.285816 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.291010 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.294355 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.295610 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-scripts\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.301826 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.312056 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.344792 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-config-data\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.347292 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.369250 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vppr2\" (UniqueName: \"kubernetes.io/projected/7eebb045-e1b7-4b9d-9d41-9485997d1743-kube-api-access-vppr2\") pod \"nova-cell0-cell-mapping-nmwj6\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.382821 4697 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.384424 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.386922 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k784\" (UniqueName: \"kubernetes.io/projected/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-kube-api-access-4k784\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.386965 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xkx4\" (UniqueName: \"kubernetes.io/projected/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-kube-api-access-7xkx4\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.386986 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.387025 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.387054 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-config-data\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.387085 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-logs\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.387099 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.387147 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqr74\" (UniqueName: \"kubernetes.io/projected/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-kube-api-access-wqr74\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.387162 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-logs\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.387186 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-config-data\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " 
pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.387206 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.391896 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.401202 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.401273 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.401783 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.436607 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqr74\" (UniqueName: \"kubernetes.io/projected/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-kube-api-access-wqr74\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.490573 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7xkx4\" (UniqueName: \"kubernetes.io/projected/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-kube-api-access-7xkx4\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.490638 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.490717 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-config-data\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.490764 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.490784 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-logs\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.490864 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-logs\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " pod="openstack/nova-metadata-0" Feb 20 
16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.490904 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-config-data\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.491007 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k784\" (UniqueName: \"kubernetes.io/projected/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-kube-api-access-4k784\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.493946 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-logs\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.499039 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-logs\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.520474 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.529470 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.530780 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.556567 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.557975 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xkx4\" (UniqueName: \"kubernetes.io/projected/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-kube-api-access-7xkx4\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.564955 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-config-data\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.565119 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.567100 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k784\" (UniqueName: \"kubernetes.io/projected/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-kube-api-access-4k784\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.580500 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 
16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.580757 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-config-data\") pod \"nova-api-0\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.599268 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.614562 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.616278 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cdd54648c-vgxw7"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.648049 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.650713 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.651223 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.680474 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cdd54648c-vgxw7"] Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.702653 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-svc\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.702713 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-config-data\") pod \"nova-scheduler-0\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.702792 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkg7q\" (UniqueName: \"kubernetes.io/projected/f858e9b3-afae-4bbd-987e-9f22fe25270c-kube-api-access-pkg7q\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.702864 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-swift-storage-0\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.702886 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-config\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.702934 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.702955 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.703002 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.703054 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nrnw\" (UniqueName: \"kubernetes.io/projected/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-kube-api-access-2nrnw\") pod \"nova-scheduler-0\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.804626 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pkg7q\" (UniqueName: \"kubernetes.io/projected/f858e9b3-afae-4bbd-987e-9f22fe25270c-kube-api-access-pkg7q\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.804703 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-swift-storage-0\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.804728 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-config\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.804753 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.804771 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.804804 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.804834 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nrnw\" (UniqueName: \"kubernetes.io/projected/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-kube-api-access-2nrnw\") pod \"nova-scheduler-0\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.804896 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-svc\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.804914 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-config-data\") pod \"nova-scheduler-0\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.806860 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-swift-storage-0\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.807700 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" 
(UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.808248 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.808831 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-config\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.809589 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-svc\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.825298 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.825761 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-config-data\") pod \"nova-scheduler-0\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.862134 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkg7q\" (UniqueName: \"kubernetes.io/projected/f858e9b3-afae-4bbd-987e-9f22fe25270c-kube-api-access-pkg7q\") pod \"dnsmasq-dns-5cdd54648c-vgxw7\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.869033 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nrnw\" (UniqueName: \"kubernetes.io/projected/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-kube-api-access-2nrnw\") pod \"nova-scheduler-0\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.961107 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 16:53:38 crc kubenswrapper[4697]: I0220 16:53:38.977230 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.167040 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.539071 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 16:53:39 crc kubenswrapper[4697]: W0220 16:53:39.544299 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd1fa9a9_3270_47ee_97a5_37cc4fbb9c1a.slice/crio-f8e43648488e7b29a4c2bd16d74a296662e1a0e046b1a0d47b0afdef5a569cb7 WatchSource:0}: Error finding container f8e43648488e7b29a4c2bd16d74a296662e1a0e046b1a0d47b0afdef5a569cb7: Status 404 returned error can't find the container with id f8e43648488e7b29a4c2bd16d74a296662e1a0e046b1a0d47b0afdef5a569cb7 Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.550356 4697 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-nmwj6"] Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.560519 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.718848 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6bjzj"] Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.720632 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.723075 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.723227 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.730535 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6bjzj"] Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.753352 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.753400 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-config-data\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.753433 
4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-scripts\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.753550 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xpgv\" (UniqueName: \"kubernetes.io/projected/33dd1dad-ee40-4520-9d04-56a1b69894a0-kube-api-access-2xpgv\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.862400 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xpgv\" (UniqueName: \"kubernetes.io/projected/33dd1dad-ee40-4520-9d04-56a1b69894a0-kube-api-access-2xpgv\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.862506 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.862541 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-config-data\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 
16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.862582 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-scripts\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.870251 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-scripts\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.870290 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.870775 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-config-data\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.883857 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xpgv\" (UniqueName: \"kubernetes.io/projected/33dd1dad-ee40-4520-9d04-56a1b69894a0-kube-api-access-2xpgv\") pod \"nova-cell1-conductor-db-sync-6bjzj\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 
16:53:39.888955 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.897353 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cdd54648c-vgxw7"] Feb 20 16:53:39 crc kubenswrapper[4697]: W0220 16:53:39.904590 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3506e35_dd19_4836_ab0f_b2d86aaaa3cc.slice/crio-194ca8e56c130f90c09fc932972f5c49aa288f145791c32043ededd58e304467 WatchSource:0}: Error finding container 194ca8e56c130f90c09fc932972f5c49aa288f145791c32043ededd58e304467: Status 404 returned error can't find the container with id 194ca8e56c130f90c09fc932972f5c49aa288f145791c32043ededd58e304467 Feb 20 16:53:39 crc kubenswrapper[4697]: W0220 16:53:39.906834 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf858e9b3_afae_4bbd_987e_9f22fe25270c.slice/crio-502392984d5366bccef8b31783b484dd8701183aee67c7b2819423461d976e7c WatchSource:0}: Error finding container 502392984d5366bccef8b31783b484dd8701183aee67c7b2819423461d976e7c: Status 404 returned error can't find the container with id 502392984d5366bccef8b31783b484dd8701183aee67c7b2819423461d976e7c Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.959974 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a","Type":"ContainerStarted","Data":"f8e43648488e7b29a4c2bd16d74a296662e1a0e046b1a0d47b0afdef5a569cb7"} Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.970421 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nmwj6" event={"ID":"7eebb045-e1b7-4b9d-9d41-9485997d1743","Type":"ContainerStarted","Data":"1e28fbf0ae4c050426541d75380fcd78c64c5fbd2c2607160ec0e5a8313c38ea"} Feb 20 16:53:39 crc 
kubenswrapper[4697]: I0220 16:53:39.970607 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nmwj6" event={"ID":"7eebb045-e1b7-4b9d-9d41-9485997d1743","Type":"ContainerStarted","Data":"3efddde434d594170f8361e4acd4544a2a40df83ee25ddcda71d36ea7d9d2a31"} Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.976008 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc","Type":"ContainerStarted","Data":"194ca8e56c130f90c09fc932972f5c49aa288f145791c32043ededd58e304467"} Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.980048 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e8ff77f1-cb68-4ab0-ade5-1d692fff2039","Type":"ContainerStarted","Data":"cbbbd63089b74a554bb6496439c5765ebb4c43fc644d34fabd0c95fb6db3db06"} Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.982118 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" event={"ID":"f858e9b3-afae-4bbd-987e-9f22fe25270c","Type":"ContainerStarted","Data":"502392984d5366bccef8b31783b484dd8701183aee67c7b2819423461d976e7c"} Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.984550 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"077aaa12-36a8-4d01-bb86-c68d1c7a41ea","Type":"ContainerStarted","Data":"6dde2667ec7cfe2833876b99bbc97560f0e0184cb3b756963d6aa1d747629525"} Feb 20 16:53:39 crc kubenswrapper[4697]: I0220 16:53:39.996340 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-nmwj6" podStartSLOduration=1.9963206420000001 podStartE2EDuration="1.996320642s" podCreationTimestamp="2026-02-20 16:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:53:39.985324054 +0000 UTC 
m=+1327.765369472" watchObservedRunningTime="2026-02-20 16:53:39.996320642 +0000 UTC m=+1327.776366050" Feb 20 16:53:40 crc kubenswrapper[4697]: I0220 16:53:40.054362 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:41 crc kubenswrapper[4697]: I0220 16:53:41.002698 4697 generic.go:334] "Generic (PLEG): container finished" podID="f858e9b3-afae-4bbd-987e-9f22fe25270c" containerID="2546ee47d878476dadc4177e8cb965174cf823b59bc79a2b8eb38fd123cb032a" exitCode=0 Feb 20 16:53:41 crc kubenswrapper[4697]: I0220 16:53:41.004535 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" event={"ID":"f858e9b3-afae-4bbd-987e-9f22fe25270c","Type":"ContainerDied","Data":"2546ee47d878476dadc4177e8cb965174cf823b59bc79a2b8eb38fd123cb032a"} Feb 20 16:53:41 crc kubenswrapper[4697]: I0220 16:53:41.784004 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:41 crc kubenswrapper[4697]: I0220 16:53:41.813408 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 16:53:42 crc kubenswrapper[4697]: I0220 16:53:42.984698 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6bjzj"] Feb 20 16:53:43 crc kubenswrapper[4697]: W0220 16:53:43.006441 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33dd1dad_ee40_4520_9d04_56a1b69894a0.slice/crio-501cd23b0426ab3caac8c705d63d3f8e2fdbf59ea0d08d13d1bda48f6a5e0e95 WatchSource:0}: Error finding container 501cd23b0426ab3caac8c705d63d3f8e2fdbf59ea0d08d13d1bda48f6a5e0e95: Status 404 returned error can't find the container with id 501cd23b0426ab3caac8c705d63d3f8e2fdbf59ea0d08d13d1bda48f6a5e0e95 Feb 20 16:53:43 crc kubenswrapper[4697]: I0220 16:53:43.035000 4697 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-scheduler-0" event={"ID":"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc","Type":"ContainerStarted","Data":"cbb29ab5ea2659209e5663e93d75b97822989d723be8795a8bc02144fcfb6363"} Feb 20 16:53:43 crc kubenswrapper[4697]: I0220 16:53:43.041337 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a","Type":"ContainerStarted","Data":"f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be"} Feb 20 16:53:43 crc kubenswrapper[4697]: I0220 16:53:43.050716 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6bjzj" event={"ID":"33dd1dad-ee40-4520-9d04-56a1b69894a0","Type":"ContainerStarted","Data":"501cd23b0426ab3caac8c705d63d3f8e2fdbf59ea0d08d13d1bda48f6a5e0e95"} Feb 20 16:53:43 crc kubenswrapper[4697]: I0220 16:53:43.058646 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e8ff77f1-cb68-4ab0-ade5-1d692fff2039","Type":"ContainerStarted","Data":"91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879"} Feb 20 16:53:43 crc kubenswrapper[4697]: I0220 16:53:43.058728 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e8ff77f1-cb68-4ab0-ade5-1d692fff2039" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879" gracePeriod=30 Feb 20 16:53:43 crc kubenswrapper[4697]: I0220 16:53:43.064036 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" event={"ID":"f858e9b3-afae-4bbd-987e-9f22fe25270c","Type":"ContainerStarted","Data":"ff3885dc951d6b4c46150938d38477867fb56f3443262b2fdc9b8448f5e33595"} Feb 20 16:53:43 crc kubenswrapper[4697]: I0220 16:53:43.064661 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 
16:53:43 crc kubenswrapper[4697]: I0220 16:53:43.065082 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.369265374 podStartE2EDuration="5.065062365s" podCreationTimestamp="2026-02-20 16:53:38 +0000 UTC" firstStartedPulling="2026-02-20 16:53:39.90687545 +0000 UTC m=+1327.686920858" lastFinishedPulling="2026-02-20 16:53:42.602672441 +0000 UTC m=+1330.382717849" observedRunningTime="2026-02-20 16:53:43.055848042 +0000 UTC m=+1330.835893450" watchObservedRunningTime="2026-02-20 16:53:43.065062365 +0000 UTC m=+1330.845107773" Feb 20 16:53:43 crc kubenswrapper[4697]: I0220 16:53:43.086005 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.726062127 podStartE2EDuration="5.085982474s" podCreationTimestamp="2026-02-20 16:53:38 +0000 UTC" firstStartedPulling="2026-02-20 16:53:39.214722245 +0000 UTC m=+1326.994767653" lastFinishedPulling="2026-02-20 16:53:42.574642592 +0000 UTC m=+1330.354688000" observedRunningTime="2026-02-20 16:53:43.080079505 +0000 UTC m=+1330.860124913" watchObservedRunningTime="2026-02-20 16:53:43.085982474 +0000 UTC m=+1330.866027882" Feb 20 16:53:43 crc kubenswrapper[4697]: I0220 16:53:43.121317 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" podStartSLOduration=5.121297808 podStartE2EDuration="5.121297808s" podCreationTimestamp="2026-02-20 16:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:53:43.10084388 +0000 UTC m=+1330.880889288" watchObservedRunningTime="2026-02-20 16:53:43.121297808 +0000 UTC m=+1330.901343206" Feb 20 16:53:43 crc kubenswrapper[4697]: I0220 16:53:43.529911 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:53:43 crc 
kubenswrapper[4697]: I0220 16:53:43.962186 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.082091 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a","Type":"ContainerStarted","Data":"fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3"} Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.087900 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6bjzj" event={"ID":"33dd1dad-ee40-4520-9d04-56a1b69894a0","Type":"ContainerStarted","Data":"2fab0bf3cef5f45dbbf1a0d28d2e66edcfc757a847d2de5005c363a251a621c0"} Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.090038 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"077aaa12-36a8-4d01-bb86-c68d1c7a41ea","Type":"ContainerStarted","Data":"fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497"} Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.090081 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"077aaa12-36a8-4d01-bb86-c68d1c7a41ea","Type":"ContainerStarted","Data":"fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff"} Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.090320 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="077aaa12-36a8-4d01-bb86-c68d1c7a41ea" containerName="nova-metadata-log" containerID="cri-o://fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff" gracePeriod=30 Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.090370 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="077aaa12-36a8-4d01-bb86-c68d1c7a41ea" containerName="nova-metadata-metadata" 
containerID="cri-o://fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497" gracePeriod=30 Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.113276 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.092010024 podStartE2EDuration="6.113257625s" podCreationTimestamp="2026-02-20 16:53:38 +0000 UTC" firstStartedPulling="2026-02-20 16:53:39.554765605 +0000 UTC m=+1327.334811013" lastFinishedPulling="2026-02-20 16:53:42.576013206 +0000 UTC m=+1330.356058614" observedRunningTime="2026-02-20 16:53:44.105369295 +0000 UTC m=+1331.885414713" watchObservedRunningTime="2026-02-20 16:53:44.113257625 +0000 UTC m=+1331.893303033" Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.122927 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-6bjzj" podStartSLOduration=5.122904049 podStartE2EDuration="5.122904049s" podCreationTimestamp="2026-02-20 16:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:53:44.119286707 +0000 UTC m=+1331.899332115" watchObservedRunningTime="2026-02-20 16:53:44.122904049 +0000 UTC m=+1331.902949457" Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.140648 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.171316969 podStartE2EDuration="6.140629357s" podCreationTimestamp="2026-02-20 16:53:38 +0000 UTC" firstStartedPulling="2026-02-20 16:53:39.604275717 +0000 UTC m=+1327.384321125" lastFinishedPulling="2026-02-20 16:53:42.573588105 +0000 UTC m=+1330.353633513" observedRunningTime="2026-02-20 16:53:44.136818611 +0000 UTC m=+1331.916864019" watchObservedRunningTime="2026-02-20 16:53:44.140629357 +0000 UTC m=+1331.920674755" Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.954410 4697 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.998010 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-logs\") pod \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.998390 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-combined-ca-bundle\") pod \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.998580 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-config-data\") pod \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.998704 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xkx4\" (UniqueName: \"kubernetes.io/projected/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-kube-api-access-7xkx4\") pod \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\" (UID: \"077aaa12-36a8-4d01-bb86-c68d1c7a41ea\") " Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.998449 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-logs" (OuterVolumeSpecName: "logs") pod "077aaa12-36a8-4d01-bb86-c68d1c7a41ea" (UID: "077aaa12-36a8-4d01-bb86-c68d1c7a41ea"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:53:44 crc kubenswrapper[4697]: I0220 16:53:44.999486 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.018894 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-kube-api-access-7xkx4" (OuterVolumeSpecName: "kube-api-access-7xkx4") pod "077aaa12-36a8-4d01-bb86-c68d1c7a41ea" (UID: "077aaa12-36a8-4d01-bb86-c68d1c7a41ea"). InnerVolumeSpecName "kube-api-access-7xkx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.033563 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-config-data" (OuterVolumeSpecName: "config-data") pod "077aaa12-36a8-4d01-bb86-c68d1c7a41ea" (UID: "077aaa12-36a8-4d01-bb86-c68d1c7a41ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.046122 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "077aaa12-36a8-4d01-bb86-c68d1c7a41ea" (UID: "077aaa12-36a8-4d01-bb86-c68d1c7a41ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.103185 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.103226 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.103239 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xkx4\" (UniqueName: \"kubernetes.io/projected/077aaa12-36a8-4d01-bb86-c68d1c7a41ea-kube-api-access-7xkx4\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.107276 4697 generic.go:334] "Generic (PLEG): container finished" podID="077aaa12-36a8-4d01-bb86-c68d1c7a41ea" containerID="fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497" exitCode=0 Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.107310 4697 generic.go:334] "Generic (PLEG): container finished" podID="077aaa12-36a8-4d01-bb86-c68d1c7a41ea" containerID="fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff" exitCode=143 Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.107327 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"077aaa12-36a8-4d01-bb86-c68d1c7a41ea","Type":"ContainerDied","Data":"fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497"} Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.107395 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"077aaa12-36a8-4d01-bb86-c68d1c7a41ea","Type":"ContainerDied","Data":"fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff"} Feb 20 16:53:45 crc 
kubenswrapper[4697]: I0220 16:53:45.107414 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"077aaa12-36a8-4d01-bb86-c68d1c7a41ea","Type":"ContainerDied","Data":"6dde2667ec7cfe2833876b99bbc97560f0e0184cb3b756963d6aa1d747629525"} Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.107476 4697 scope.go:117] "RemoveContainer" containerID="fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.107740 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.143215 4697 scope.go:117] "RemoveContainer" containerID="fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.171504 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.183270 4697 scope.go:117] "RemoveContainer" containerID="fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497" Feb 20 16:53:45 crc kubenswrapper[4697]: E0220 16:53:45.183722 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497\": container with ID starting with fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497 not found: ID does not exist" containerID="fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.183763 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497"} err="failed to get container status \"fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497\": rpc error: code = NotFound desc = could not find 
container \"fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497\": container with ID starting with fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497 not found: ID does not exist" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.183791 4697 scope.go:117] "RemoveContainer" containerID="fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff" Feb 20 16:53:45 crc kubenswrapper[4697]: E0220 16:53:45.184620 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff\": container with ID starting with fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff not found: ID does not exist" containerID="fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.184701 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff"} err="failed to get container status \"fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff\": rpc error: code = NotFound desc = could not find container \"fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff\": container with ID starting with fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff not found: ID does not exist" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.184727 4697 scope.go:117] "RemoveContainer" containerID="fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.188948 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.189606 4697 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497"} err="failed to get container status \"fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497\": rpc error: code = NotFound desc = could not find container \"fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497\": container with ID starting with fc78fd28eebdf8ae47c213fa047394f423882ac31ffe3619c304f5478c848497 not found: ID does not exist" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.189639 4697 scope.go:117] "RemoveContainer" containerID="fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.193764 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff"} err="failed to get container status \"fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff\": rpc error: code = NotFound desc = could not find container \"fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff\": container with ID starting with fb3dda6f72881371d5e9ae297679d08ec609523010da08e865c2353e69aed4ff not found: ID does not exist" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.215272 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:45 crc kubenswrapper[4697]: E0220 16:53:45.220092 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077aaa12-36a8-4d01-bb86-c68d1c7a41ea" containerName="nova-metadata-metadata" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.220148 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="077aaa12-36a8-4d01-bb86-c68d1c7a41ea" containerName="nova-metadata-metadata" Feb 20 16:53:45 crc kubenswrapper[4697]: E0220 16:53:45.220204 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077aaa12-36a8-4d01-bb86-c68d1c7a41ea" 
containerName="nova-metadata-log" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.220215 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="077aaa12-36a8-4d01-bb86-c68d1c7a41ea" containerName="nova-metadata-log" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.220709 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="077aaa12-36a8-4d01-bb86-c68d1c7a41ea" containerName="nova-metadata-metadata" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.220788 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="077aaa12-36a8-4d01-bb86-c68d1c7a41ea" containerName="nova-metadata-log" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.222571 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.225933 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.226092 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.240315 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.417530 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-config-data\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.417576 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" 
(UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.417643 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxv5m\" (UniqueName: \"kubernetes.io/projected/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-kube-api-access-mxv5m\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.417881 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-logs\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.418015 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.520005 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-logs\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.520125 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 
16:53:45.520195 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-config-data\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.520214 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.520275 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxv5m\" (UniqueName: \"kubernetes.io/projected/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-kube-api-access-mxv5m\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.520618 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-logs\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.528995 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.529137 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.539052 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-config-data\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.540737 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxv5m\" (UniqueName: \"kubernetes.io/projected/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-kube-api-access-mxv5m\") pod \"nova-metadata-0\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " pod="openstack/nova-metadata-0" Feb 20 16:53:45 crc kubenswrapper[4697]: I0220 16:53:45.561248 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:53:46 crc kubenswrapper[4697]: I0220 16:53:46.041493 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:46 crc kubenswrapper[4697]: I0220 16:53:46.118973 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb","Type":"ContainerStarted","Data":"deaad5604a1545c79df3fbd122abe22185118dc876477936aec552c765ec799d"} Feb 20 16:53:46 crc kubenswrapper[4697]: I0220 16:53:46.888478 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077aaa12-36a8-4d01-bb86-c68d1c7a41ea" path="/var/lib/kubelet/pods/077aaa12-36a8-4d01-bb86-c68d1c7a41ea/volumes" Feb 20 16:53:47 crc kubenswrapper[4697]: I0220 16:53:47.128594 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb","Type":"ContainerStarted","Data":"3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090"} Feb 20 16:53:47 crc kubenswrapper[4697]: I0220 16:53:47.128976 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb","Type":"ContainerStarted","Data":"fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944"} Feb 20 16:53:47 crc kubenswrapper[4697]: I0220 16:53:47.157356 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.157337684 podStartE2EDuration="2.157337684s" podCreationTimestamp="2026-02-20 16:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:53:47.153184649 +0000 UTC m=+1334.933230077" watchObservedRunningTime="2026-02-20 16:53:47.157337684 +0000 UTC m=+1334.937383092" Feb 20 16:53:48 crc kubenswrapper[4697]: I0220 16:53:48.139949 4697 
generic.go:334] "Generic (PLEG): container finished" podID="7eebb045-e1b7-4b9d-9d41-9485997d1743" containerID="1e28fbf0ae4c050426541d75380fcd78c64c5fbd2c2607160ec0e5a8313c38ea" exitCode=0 Feb 20 16:53:48 crc kubenswrapper[4697]: I0220 16:53:48.139974 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nmwj6" event={"ID":"7eebb045-e1b7-4b9d-9d41-9485997d1743","Type":"ContainerDied","Data":"1e28fbf0ae4c050426541d75380fcd78c64c5fbd2c2607160ec0e5a8313c38ea"} Feb 20 16:53:48 crc kubenswrapper[4697]: I0220 16:53:48.600442 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 16:53:48 crc kubenswrapper[4697]: I0220 16:53:48.600787 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 16:53:48 crc kubenswrapper[4697]: I0220 16:53:48.962037 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 16:53:48 crc kubenswrapper[4697]: I0220 16:53:48.979341 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.002500 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.044497 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b5bbf5bf5-dr9md"] Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.045227 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" podUID="aeadc920-c9f4-4965-9b75-a233291a3f6a" containerName="dnsmasq-dns" containerID="cri-o://862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3" gracePeriod=10 Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.192780 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-scheduler-0" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.624853 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.686534 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.686835 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.727315 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-scripts\") pod \"7eebb045-e1b7-4b9d-9d41-9485997d1743\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.727405 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vppr2\" (UniqueName: \"kubernetes.io/projected/7eebb045-e1b7-4b9d-9d41-9485997d1743-kube-api-access-vppr2\") pod \"7eebb045-e1b7-4b9d-9d41-9485997d1743\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.727543 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-combined-ca-bundle\") pod \"7eebb045-e1b7-4b9d-9d41-9485997d1743\" (UID: 
\"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.727630 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-config-data\") pod \"7eebb045-e1b7-4b9d-9d41-9485997d1743\" (UID: \"7eebb045-e1b7-4b9d-9d41-9485997d1743\") " Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.733766 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eebb045-e1b7-4b9d-9d41-9485997d1743-kube-api-access-vppr2" (OuterVolumeSpecName: "kube-api-access-vppr2") pod "7eebb045-e1b7-4b9d-9d41-9485997d1743" (UID: "7eebb045-e1b7-4b9d-9d41-9485997d1743"). InnerVolumeSpecName "kube-api-access-vppr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.733799 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-scripts" (OuterVolumeSpecName: "scripts") pod "7eebb045-e1b7-4b9d-9d41-9485997d1743" (UID: "7eebb045-e1b7-4b9d-9d41-9485997d1743"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.761019 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7eebb045-e1b7-4b9d-9d41-9485997d1743" (UID: "7eebb045-e1b7-4b9d-9d41-9485997d1743"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.773631 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-config-data" (OuterVolumeSpecName: "config-data") pod "7eebb045-e1b7-4b9d-9d41-9485997d1743" (UID: "7eebb045-e1b7-4b9d-9d41-9485997d1743"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.800243 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.829574 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vppr2\" (UniqueName: \"kubernetes.io/projected/7eebb045-e1b7-4b9d-9d41-9485997d1743-kube-api-access-vppr2\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.829615 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.829629 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.829639 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eebb045-e1b7-4b9d-9d41-9485997d1743-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.930287 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-svc\") pod 
\"aeadc920-c9f4-4965-9b75-a233291a3f6a\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.930343 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-swift-storage-0\") pod \"aeadc920-c9f4-4965-9b75-a233291a3f6a\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.930375 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-sb\") pod \"aeadc920-c9f4-4965-9b75-a233291a3f6a\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.930390 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25dm9\" (UniqueName: \"kubernetes.io/projected/aeadc920-c9f4-4965-9b75-a233291a3f6a-kube-api-access-25dm9\") pod \"aeadc920-c9f4-4965-9b75-a233291a3f6a\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.930549 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-nb\") pod \"aeadc920-c9f4-4965-9b75-a233291a3f6a\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.930597 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-config\") pod \"aeadc920-c9f4-4965-9b75-a233291a3f6a\" (UID: \"aeadc920-c9f4-4965-9b75-a233291a3f6a\") " Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.935629 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/aeadc920-c9f4-4965-9b75-a233291a3f6a-kube-api-access-25dm9" (OuterVolumeSpecName: "kube-api-access-25dm9") pod "aeadc920-c9f4-4965-9b75-a233291a3f6a" (UID: "aeadc920-c9f4-4965-9b75-a233291a3f6a"). InnerVolumeSpecName "kube-api-access-25dm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.992673 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aeadc920-c9f4-4965-9b75-a233291a3f6a" (UID: "aeadc920-c9f4-4965-9b75-a233291a3f6a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.998706 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aeadc920-c9f4-4965-9b75-a233291a3f6a" (UID: "aeadc920-c9f4-4965-9b75-a233291a3f6a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:53:49 crc kubenswrapper[4697]: I0220 16:53:49.999536 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aeadc920-c9f4-4965-9b75-a233291a3f6a" (UID: "aeadc920-c9f4-4965-9b75-a233291a3f6a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.001805 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-config" (OuterVolumeSpecName: "config") pod "aeadc920-c9f4-4965-9b75-a233291a3f6a" (UID: "aeadc920-c9f4-4965-9b75-a233291a3f6a"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.013191 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aeadc920-c9f4-4965-9b75-a233291a3f6a" (UID: "aeadc920-c9f4-4965-9b75-a233291a3f6a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.032707 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.032931 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.032992 4697 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.033051 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.033106 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25dm9\" (UniqueName: \"kubernetes.io/projected/aeadc920-c9f4-4965-9b75-a233291a3f6a-kube-api-access-25dm9\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.033173 4697 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeadc920-c9f4-4965-9b75-a233291a3f6a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.158745 4697 generic.go:334] "Generic (PLEG): container finished" podID="aeadc920-c9f4-4965-9b75-a233291a3f6a" containerID="862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3" exitCode=0 Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.158849 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.158853 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" event={"ID":"aeadc920-c9f4-4965-9b75-a233291a3f6a","Type":"ContainerDied","Data":"862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3"} Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.159245 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b5bbf5bf5-dr9md" event={"ID":"aeadc920-c9f4-4965-9b75-a233291a3f6a","Type":"ContainerDied","Data":"4a5fa885183ff64ba1f284e4c73bf3c9c0dcac4e7a8d66cbe2786c3a01962996"} Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.159282 4697 scope.go:117] "RemoveContainer" containerID="862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.161203 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nmwj6" event={"ID":"7eebb045-e1b7-4b9d-9d41-9485997d1743","Type":"ContainerDied","Data":"3efddde434d594170f8361e4acd4544a2a40df83ee25ddcda71d36ea7d9d2a31"} Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.161283 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3efddde434d594170f8361e4acd4544a2a40df83ee25ddcda71d36ea7d9d2a31" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.161214 4697 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nmwj6" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.182367 4697 scope.go:117] "RemoveContainer" containerID="48333e517f570ff008ac05ba32b933cc91a12ae29e807f9d64bab4ee382e8b83" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.271880 4697 scope.go:117] "RemoveContainer" containerID="862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3" Feb 20 16:53:50 crc kubenswrapper[4697]: E0220 16:53:50.272288 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3\": container with ID starting with 862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3 not found: ID does not exist" containerID="862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.272313 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3"} err="failed to get container status \"862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3\": rpc error: code = NotFound desc = could not find container \"862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3\": container with ID starting with 862303edf14b51699ea174b9d7bd24b4efe72488f93a3ea9dedebf991f1943a3 not found: ID does not exist" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.272333 4697 scope.go:117] "RemoveContainer" containerID="48333e517f570ff008ac05ba32b933cc91a12ae29e807f9d64bab4ee382e8b83" Feb 20 16:53:50 crc kubenswrapper[4697]: E0220 16:53:50.272955 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48333e517f570ff008ac05ba32b933cc91a12ae29e807f9d64bab4ee382e8b83\": container with ID 
starting with 48333e517f570ff008ac05ba32b933cc91a12ae29e807f9d64bab4ee382e8b83 not found: ID does not exist" containerID="48333e517f570ff008ac05ba32b933cc91a12ae29e807f9d64bab4ee382e8b83" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.272983 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48333e517f570ff008ac05ba32b933cc91a12ae29e807f9d64bab4ee382e8b83"} err="failed to get container status \"48333e517f570ff008ac05ba32b933cc91a12ae29e807f9d64bab4ee382e8b83\": rpc error: code = NotFound desc = could not find container \"48333e517f570ff008ac05ba32b933cc91a12ae29e807f9d64bab4ee382e8b83\": container with ID starting with 48333e517f570ff008ac05ba32b933cc91a12ae29e807f9d64bab4ee382e8b83 not found: ID does not exist" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.345222 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b5bbf5bf5-dr9md"] Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.357153 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b5bbf5bf5-dr9md"] Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.365808 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.375851 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.376089 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" containerName="nova-api-log" containerID="cri-o://f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be" gracePeriod=30 Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.376216 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" containerName="nova-api-api" 
containerID="cri-o://fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3" gracePeriod=30 Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.421715 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.421937 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" containerName="nova-metadata-log" containerID="cri-o://fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944" gracePeriod=30 Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.422114 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" containerName="nova-metadata-metadata" containerID="cri-o://3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090" gracePeriod=30 Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.562198 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.562343 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.909311 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeadc920-c9f4-4965-9b75-a233291a3f6a" path="/var/lib/kubelet/pods/aeadc920-c9f4-4965-9b75-a233291a3f6a/volumes" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.938350 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.955944 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-nova-metadata-tls-certs\") pod \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.956012 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-combined-ca-bundle\") pod \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.956120 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-logs\") pod \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.956158 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-config-data\") pod \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.956189 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxv5m\" (UniqueName: \"kubernetes.io/projected/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-kube-api-access-mxv5m\") pod \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\" (UID: \"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb\") " Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.957758 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-logs" (OuterVolumeSpecName: "logs") pod "bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" (UID: "bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.958244 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:50 crc kubenswrapper[4697]: I0220 16:53:50.963104 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-kube-api-access-mxv5m" (OuterVolumeSpecName: "kube-api-access-mxv5m") pod "bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" (UID: "bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb"). InnerVolumeSpecName "kube-api-access-mxv5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.004259 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" (UID: "bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.011029 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-config-data" (OuterVolumeSpecName: "config-data") pod "bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" (UID: "bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.032757 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" (UID: "bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.060525 4697 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.060561 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.060581 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.060590 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxv5m\" (UniqueName: \"kubernetes.io/projected/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb-kube-api-access-mxv5m\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.172519 4697 generic.go:334] "Generic (PLEG): container finished" podID="bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" containerID="3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090" exitCode=0 Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.172557 4697 generic.go:334] "Generic (PLEG): container finished" 
podID="bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" containerID="fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944" exitCode=143 Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.172598 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb","Type":"ContainerDied","Data":"3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090"} Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.172629 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb","Type":"ContainerDied","Data":"fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944"} Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.172642 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb","Type":"ContainerDied","Data":"deaad5604a1545c79df3fbd122abe22185118dc876477936aec552c765ec799d"} Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.172661 4697 scope.go:117] "RemoveContainer" containerID="3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.172796 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.190872 4697 generic.go:334] "Generic (PLEG): container finished" podID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" containerID="f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be" exitCode=143 Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.191058 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b3506e35-dd19-4836-ab0f-b2d86aaaa3cc" containerName="nova-scheduler-scheduler" containerID="cri-o://cbb29ab5ea2659209e5663e93d75b97822989d723be8795a8bc02144fcfb6363" gracePeriod=30 Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.191334 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a","Type":"ContainerDied","Data":"f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be"} Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.258301 4697 scope.go:117] "RemoveContainer" containerID="fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.269502 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.278124 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.286521 4697 scope.go:117] "RemoveContainer" containerID="3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090" Feb 20 16:53:51 crc kubenswrapper[4697]: E0220 16:53:51.286914 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090\": container with ID starting with 3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090 not found: ID 
does not exist" containerID="3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.286952 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090"} err="failed to get container status \"3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090\": rpc error: code = NotFound desc = could not find container \"3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090\": container with ID starting with 3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090 not found: ID does not exist" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.286979 4697 scope.go:117] "RemoveContainer" containerID="fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944" Feb 20 16:53:51 crc kubenswrapper[4697]: E0220 16:53:51.287233 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944\": container with ID starting with fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944 not found: ID does not exist" containerID="fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.287265 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944"} err="failed to get container status \"fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944\": rpc error: code = NotFound desc = could not find container \"fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944\": container with ID starting with fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944 not found: ID does not exist" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.287290 4697 
scope.go:117] "RemoveContainer" containerID="3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.287645 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090"} err="failed to get container status \"3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090\": rpc error: code = NotFound desc = could not find container \"3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090\": container with ID starting with 3a7db1001186ec24f9a10a555a29589140b21f99c07d50df101f0a7a5fccd090 not found: ID does not exist" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.287677 4697 scope.go:117] "RemoveContainer" containerID="fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.287974 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944"} err="failed to get container status \"fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944\": rpc error: code = NotFound desc = could not find container \"fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944\": container with ID starting with fb0d928b11ce78967a1beeed603a289dc3fdc09cba82103c4ad17cc0f0a28944 not found: ID does not exist" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.327638 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:51 crc kubenswrapper[4697]: E0220 16:53:51.328029 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeadc920-c9f4-4965-9b75-a233291a3f6a" containerName="init" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.328045 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeadc920-c9f4-4965-9b75-a233291a3f6a" containerName="init" 
Feb 20 16:53:51 crc kubenswrapper[4697]: E0220 16:53:51.328054 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" containerName="nova-metadata-log" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.328060 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" containerName="nova-metadata-log" Feb 20 16:53:51 crc kubenswrapper[4697]: E0220 16:53:51.328074 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" containerName="nova-metadata-metadata" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.328081 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" containerName="nova-metadata-metadata" Feb 20 16:53:51 crc kubenswrapper[4697]: E0220 16:53:51.328103 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eebb045-e1b7-4b9d-9d41-9485997d1743" containerName="nova-manage" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.328108 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eebb045-e1b7-4b9d-9d41-9485997d1743" containerName="nova-manage" Feb 20 16:53:51 crc kubenswrapper[4697]: E0220 16:53:51.328121 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeadc920-c9f4-4965-9b75-a233291a3f6a" containerName="dnsmasq-dns" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.328127 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeadc920-c9f4-4965-9b75-a233291a3f6a" containerName="dnsmasq-dns" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.328293 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" containerName="nova-metadata-log" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.328301 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeadc920-c9f4-4965-9b75-a233291a3f6a" containerName="dnsmasq-dns" Feb 
20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.328317 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" containerName="nova-metadata-metadata" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.328329 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eebb045-e1b7-4b9d-9d41-9485997d1743" containerName="nova-manage" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.329287 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.335458 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.335689 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.358561 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.381251 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-config-data\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.381361 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.381469 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7t24c\" (UniqueName: \"kubernetes.io/projected/b69887b6-1e06-4500-b650-1fb06bed56c7-kube-api-access-7t24c\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.381520 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69887b6-1e06-4500-b650-1fb06bed56c7-logs\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.381561 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.483771 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-config-data\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.483851 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.483908 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t24c\" (UniqueName: \"kubernetes.io/projected/b69887b6-1e06-4500-b650-1fb06bed56c7-kube-api-access-7t24c\") pod 
\"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.483946 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69887b6-1e06-4500-b650-1fb06bed56c7-logs\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.483975 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.484605 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69887b6-1e06-4500-b650-1fb06bed56c7-logs\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.489109 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.500356 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-config-data\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.505008 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.527012 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t24c\" (UniqueName: \"kubernetes.io/projected/b69887b6-1e06-4500-b650-1fb06bed56c7-kube-api-access-7t24c\") pod \"nova-metadata-0\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " pod="openstack/nova-metadata-0" Feb 20 16:53:51 crc kubenswrapper[4697]: I0220 16:53:51.659869 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:53:52 crc kubenswrapper[4697]: I0220 16:53:52.188320 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:53:52 crc kubenswrapper[4697]: I0220 16:53:52.203323 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b69887b6-1e06-4500-b650-1fb06bed56c7","Type":"ContainerStarted","Data":"7d4da44a36ffa2669e9eff0879eaf415e5308a11d4f18405eb3309e3e5bbef4a"} Feb 20 16:53:52 crc kubenswrapper[4697]: I0220 16:53:52.894469 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb" path="/var/lib/kubelet/pods/bd8ef74e-f9b6-473a-bcfc-0ea255c8c4bb/volumes" Feb 20 16:53:53 crc kubenswrapper[4697]: I0220 16:53:53.221932 4697 generic.go:334] "Generic (PLEG): container finished" podID="33dd1dad-ee40-4520-9d04-56a1b69894a0" containerID="2fab0bf3cef5f45dbbf1a0d28d2e66edcfc757a847d2de5005c363a251a621c0" exitCode=0 Feb 20 16:53:53 crc kubenswrapper[4697]: I0220 16:53:53.222047 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6bjzj" 
event={"ID":"33dd1dad-ee40-4520-9d04-56a1b69894a0","Type":"ContainerDied","Data":"2fab0bf3cef5f45dbbf1a0d28d2e66edcfc757a847d2de5005c363a251a621c0"} Feb 20 16:53:53 crc kubenswrapper[4697]: I0220 16:53:53.225998 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b69887b6-1e06-4500-b650-1fb06bed56c7","Type":"ContainerStarted","Data":"74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a"} Feb 20 16:53:53 crc kubenswrapper[4697]: I0220 16:53:53.226069 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b69887b6-1e06-4500-b650-1fb06bed56c7","Type":"ContainerStarted","Data":"a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41"} Feb 20 16:53:53 crc kubenswrapper[4697]: I0220 16:53:53.268579 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.268563496 podStartE2EDuration="2.268563496s" podCreationTimestamp="2026-02-20 16:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:53:53.264465292 +0000 UTC m=+1341.044510710" watchObservedRunningTime="2026-02-20 16:53:53.268563496 +0000 UTC m=+1341.048608904" Feb 20 16:53:53 crc kubenswrapper[4697]: E0220 16:53:53.717303 4697 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd1fa9a9_3270_47ee_97a5_37cc4fbb9c1a.slice/crio-conmon-fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3.scope\": RecentStats: unable to find data in memory cache]" Feb 20 16:53:53 crc kubenswrapper[4697]: I0220 16:53:53.941977 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 16:53:53 crc kubenswrapper[4697]: E0220 16:53:53.967263 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbb29ab5ea2659209e5663e93d75b97822989d723be8795a8bc02144fcfb6363" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 16:53:53 crc kubenswrapper[4697]: E0220 16:53:53.968902 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbb29ab5ea2659209e5663e93d75b97822989d723be8795a8bc02144fcfb6363" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 16:53:53 crc kubenswrapper[4697]: E0220 16:53:53.972075 4697 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbb29ab5ea2659209e5663e93d75b97822989d723be8795a8bc02144fcfb6363" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 16:53:53 crc kubenswrapper[4697]: E0220 16:53:53.972165 4697 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b3506e35-dd19-4836-ab0f-b2d86aaaa3cc" containerName="nova-scheduler-scheduler" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.061209 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-logs\") pod \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 
16:53:54.061295 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-combined-ca-bundle\") pod \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.061459 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k784\" (UniqueName: \"kubernetes.io/projected/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-kube-api-access-4k784\") pod \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.061537 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-config-data\") pod \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\" (UID: \"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a\") " Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.061808 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-logs" (OuterVolumeSpecName: "logs") pod "bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" (UID: "bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.061945 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.069672 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-kube-api-access-4k784" (OuterVolumeSpecName: "kube-api-access-4k784") pod "bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" (UID: "bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a"). InnerVolumeSpecName "kube-api-access-4k784". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.093714 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-config-data" (OuterVolumeSpecName: "config-data") pod "bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" (UID: "bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.101208 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" (UID: "bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.163558 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k784\" (UniqueName: \"kubernetes.io/projected/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-kube-api-access-4k784\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.163590 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.163602 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.235109 4697 generic.go:334] "Generic (PLEG): container finished" podID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" containerID="fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3" exitCode=0 Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.235171 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.235241 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a","Type":"ContainerDied","Data":"fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3"} Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.235292 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a","Type":"ContainerDied","Data":"f8e43648488e7b29a4c2bd16d74a296662e1a0e046b1a0d47b0afdef5a569cb7"} Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.235332 4697 scope.go:117] "RemoveContainer" containerID="fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.282308 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.283615 4697 scope.go:117] "RemoveContainer" containerID="f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.295572 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.311199 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 16:53:54 crc kubenswrapper[4697]: E0220 16:53:54.311792 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" containerName="nova-api-log" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.311915 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" containerName="nova-api-log" Feb 20 16:53:54 crc kubenswrapper[4697]: E0220 16:53:54.312050 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" 
containerName="nova-api-api" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.312111 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" containerName="nova-api-api" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.312357 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" containerName="nova-api-log" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.312426 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" containerName="nova-api-api" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.313519 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.320246 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.332261 4697 scope.go:117] "RemoveContainer" containerID="fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3" Feb 20 16:53:54 crc kubenswrapper[4697]: E0220 16:53:54.335927 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3\": container with ID starting with fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3 not found: ID does not exist" containerID="fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.335992 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3"} err="failed to get container status \"fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3\": rpc error: code = NotFound desc = could not find container 
\"fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3\": container with ID starting with fd170fae65fac0999c56cf75ac3a0932a38bb48d4231ded9ac898baffe9915a3 not found: ID does not exist" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.336024 4697 scope.go:117] "RemoveContainer" containerID="f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be" Feb 20 16:53:54 crc kubenswrapper[4697]: E0220 16:53:54.338523 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be\": container with ID starting with f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be not found: ID does not exist" containerID="f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.338574 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be"} err="failed to get container status \"f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be\": rpc error: code = NotFound desc = could not find container \"f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be\": container with ID starting with f94a1168e90fdc3358c4baa4d29d2ab3b59c952abee5e2ef9c77521ec89433be not found: ID does not exist" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.342288 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.376069 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: 
I0220 16:53:54.376244 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-logs\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.376306 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-config-data\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.376484 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2z7f\" (UniqueName: \"kubernetes.io/projected/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-kube-api-access-b2z7f\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.478921 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2z7f\" (UniqueName: \"kubernetes.io/projected/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-kube-api-access-b2z7f\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.479354 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.479484 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-logs\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.479529 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-config-data\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.479964 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-logs\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.484772 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.485190 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-config-data\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.497955 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2z7f\" (UniqueName: \"kubernetes.io/projected/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-kube-api-access-b2z7f\") pod \"nova-api-0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") " pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.573726 4697 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.639629 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.681886 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-combined-ca-bundle\") pod \"33dd1dad-ee40-4520-9d04-56a1b69894a0\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.682052 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xpgv\" (UniqueName: \"kubernetes.io/projected/33dd1dad-ee40-4520-9d04-56a1b69894a0-kube-api-access-2xpgv\") pod \"33dd1dad-ee40-4520-9d04-56a1b69894a0\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.682084 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-scripts\") pod \"33dd1dad-ee40-4520-9d04-56a1b69894a0\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.682221 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-config-data\") pod \"33dd1dad-ee40-4520-9d04-56a1b69894a0\" (UID: \"33dd1dad-ee40-4520-9d04-56a1b69894a0\") " Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.686349 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33dd1dad-ee40-4520-9d04-56a1b69894a0-kube-api-access-2xpgv" (OuterVolumeSpecName: "kube-api-access-2xpgv") pod "33dd1dad-ee40-4520-9d04-56a1b69894a0" (UID: 
"33dd1dad-ee40-4520-9d04-56a1b69894a0"). InnerVolumeSpecName "kube-api-access-2xpgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.686923 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-scripts" (OuterVolumeSpecName: "scripts") pod "33dd1dad-ee40-4520-9d04-56a1b69894a0" (UID: "33dd1dad-ee40-4520-9d04-56a1b69894a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.710739 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33dd1dad-ee40-4520-9d04-56a1b69894a0" (UID: "33dd1dad-ee40-4520-9d04-56a1b69894a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.711780 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-config-data" (OuterVolumeSpecName: "config-data") pod "33dd1dad-ee40-4520-9d04-56a1b69894a0" (UID: "33dd1dad-ee40-4520-9d04-56a1b69894a0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.784157 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.784467 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.784483 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xpgv\" (UniqueName: \"kubernetes.io/projected/33dd1dad-ee40-4520-9d04-56a1b69894a0-kube-api-access-2xpgv\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.784493 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33dd1dad-ee40-4520-9d04-56a1b69894a0-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:54 crc kubenswrapper[4697]: I0220 16:53:54.920174 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a" path="/var/lib/kubelet/pods/bd1fa9a9-3270-47ee-97a5-37cc4fbb9c1a/volumes" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.102868 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.248035 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6bjzj" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.248208 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6bjzj" event={"ID":"33dd1dad-ee40-4520-9d04-56a1b69894a0","Type":"ContainerDied","Data":"501cd23b0426ab3caac8c705d63d3f8e2fdbf59ea0d08d13d1bda48f6a5e0e95"} Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.248243 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="501cd23b0426ab3caac8c705d63d3f8e2fdbf59ea0d08d13d1bda48f6a5e0e95" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.252676 4697 generic.go:334] "Generic (PLEG): container finished" podID="b3506e35-dd19-4836-ab0f-b2d86aaaa3cc" containerID="cbb29ab5ea2659209e5663e93d75b97822989d723be8795a8bc02144fcfb6363" exitCode=0 Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.252735 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc","Type":"ContainerDied","Data":"cbb29ab5ea2659209e5663e93d75b97822989d723be8795a8bc02144fcfb6363"} Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.256089 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"92b2a1ec-bc80-46bc-bb32-23fb81c317f0","Type":"ContainerStarted","Data":"0e5b77b346c61e9bf02571ba4b360b3d290b85b0864ba7edc44d1a116bf85a1f"} Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.256119 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"92b2a1ec-bc80-46bc-bb32-23fb81c317f0","Type":"ContainerStarted","Data":"224cbcfb85a3f4b9e734ff1b3e4fa9467204bfcc56bedf37cafb3e834bbe093a"} Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.288801 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.329552 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 16:53:55 crc kubenswrapper[4697]: E0220 16:53:55.330173 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3506e35-dd19-4836-ab0f-b2d86aaaa3cc" containerName="nova-scheduler-scheduler" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.330241 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3506e35-dd19-4836-ab0f-b2d86aaaa3cc" containerName="nova-scheduler-scheduler" Feb 20 16:53:55 crc kubenswrapper[4697]: E0220 16:53:55.330353 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33dd1dad-ee40-4520-9d04-56a1b69894a0" containerName="nova-cell1-conductor-db-sync" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.330418 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="33dd1dad-ee40-4520-9d04-56a1b69894a0" containerName="nova-cell1-conductor-db-sync" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.330692 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3506e35-dd19-4836-ab0f-b2d86aaaa3cc" containerName="nova-scheduler-scheduler" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.330767 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="33dd1dad-ee40-4520-9d04-56a1b69894a0" containerName="nova-cell1-conductor-db-sync" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.331534 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.337764 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.354420 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.396225 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-config-data\") pod \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.396393 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nrnw\" (UniqueName: \"kubernetes.io/projected/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-kube-api-access-2nrnw\") pod \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.396444 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-combined-ca-bundle\") pod \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\" (UID: \"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc\") " Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.396783 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52da959-5d9a-4a66-8907-400b5bd0acfa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a52da959-5d9a-4a66-8907-400b5bd0acfa\") " pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.396808 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a52da959-5d9a-4a66-8907-400b5bd0acfa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a52da959-5d9a-4a66-8907-400b5bd0acfa\") " pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.396842 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgndc\" (UniqueName: \"kubernetes.io/projected/a52da959-5d9a-4a66-8907-400b5bd0acfa-kube-api-access-hgndc\") pod \"nova-cell1-conductor-0\" (UID: \"a52da959-5d9a-4a66-8907-400b5bd0acfa\") " pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.407514 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-kube-api-access-2nrnw" (OuterVolumeSpecName: "kube-api-access-2nrnw") pod "b3506e35-dd19-4836-ab0f-b2d86aaaa3cc" (UID: "b3506e35-dd19-4836-ab0f-b2d86aaaa3cc"). InnerVolumeSpecName "kube-api-access-2nrnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.431372 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3506e35-dd19-4836-ab0f-b2d86aaaa3cc" (UID: "b3506e35-dd19-4836-ab0f-b2d86aaaa3cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.448616 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-config-data" (OuterVolumeSpecName: "config-data") pod "b3506e35-dd19-4836-ab0f-b2d86aaaa3cc" (UID: "b3506e35-dd19-4836-ab0f-b2d86aaaa3cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.500578 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52da959-5d9a-4a66-8907-400b5bd0acfa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a52da959-5d9a-4a66-8907-400b5bd0acfa\") " pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.500634 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a52da959-5d9a-4a66-8907-400b5bd0acfa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a52da959-5d9a-4a66-8907-400b5bd0acfa\") " pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.500699 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgndc\" (UniqueName: \"kubernetes.io/projected/a52da959-5d9a-4a66-8907-400b5bd0acfa-kube-api-access-hgndc\") pod \"nova-cell1-conductor-0\" (UID: \"a52da959-5d9a-4a66-8907-400b5bd0acfa\") " pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.501005 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.501024 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nrnw\" (UniqueName: \"kubernetes.io/projected/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-kube-api-access-2nrnw\") on node \"crc\" DevicePath \"\"" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.501040 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.527553 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a52da959-5d9a-4a66-8907-400b5bd0acfa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a52da959-5d9a-4a66-8907-400b5bd0acfa\") " pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.528830 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgndc\" (UniqueName: \"kubernetes.io/projected/a52da959-5d9a-4a66-8907-400b5bd0acfa-kube-api-access-hgndc\") pod \"nova-cell1-conductor-0\" (UID: \"a52da959-5d9a-4a66-8907-400b5bd0acfa\") " pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.529021 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52da959-5d9a-4a66-8907-400b5bd0acfa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a52da959-5d9a-4a66-8907-400b5bd0acfa\") " pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:55 crc kubenswrapper[4697]: I0220 16:53:55.679619 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:56 crc kubenswrapper[4697]: W0220 16:53:56.119941 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda52da959_5d9a_4a66_8907_400b5bd0acfa.slice/crio-62445368478c2a3f4bd734b9217011cd4bfee6ee7a112d62a84622def71949d6 WatchSource:0}: Error finding container 62445368478c2a3f4bd734b9217011cd4bfee6ee7a112d62a84622def71949d6: Status 404 returned error can't find the container with id 62445368478c2a3f4bd734b9217011cd4bfee6ee7a112d62a84622def71949d6 Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.124830 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.201456 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.272711 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.276518 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3506e35-dd19-4836-ab0f-b2d86aaaa3cc","Type":"ContainerDied","Data":"194ca8e56c130f90c09fc932972f5c49aa288f145791c32043ededd58e304467"} Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.276582 4697 scope.go:117] "RemoveContainer" containerID="cbb29ab5ea2659209e5663e93d75b97822989d723be8795a8bc02144fcfb6363" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.280700 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"92b2a1ec-bc80-46bc-bb32-23fb81c317f0","Type":"ContainerStarted","Data":"3f31414cefa11dd99157f117dc9b56bbf6c962f8588f150161df2bd61702f4f0"} Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.289612 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a52da959-5d9a-4a66-8907-400b5bd0acfa","Type":"ContainerStarted","Data":"62445368478c2a3f4bd734b9217011cd4bfee6ee7a112d62a84622def71949d6"} Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.324749 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.324729951 podStartE2EDuration="2.324729951s" podCreationTimestamp="2026-02-20 16:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:53:56.322611148 +0000 UTC m=+1344.102656556" watchObservedRunningTime="2026-02-20 16:53:56.324729951 +0000 UTC m=+1344.104775359" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.355595 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.377577 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] 
Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.398459 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.400068 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.403301 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.430984 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.521248 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"709f19f5-be99-41dd-9d8c-74dc997ee184\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.521309 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-config-data\") pod \"nova-scheduler-0\" (UID: \"709f19f5-be99-41dd-9d8c-74dc997ee184\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.521329 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqvnp\" (UniqueName: \"kubernetes.io/projected/709f19f5-be99-41dd-9d8c-74dc997ee184-kube-api-access-bqvnp\") pod \"nova-scheduler-0\" (UID: \"709f19f5-be99-41dd-9d8c-74dc997ee184\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.623983 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"709f19f5-be99-41dd-9d8c-74dc997ee184\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.624976 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-config-data\") pod \"nova-scheduler-0\" (UID: \"709f19f5-be99-41dd-9d8c-74dc997ee184\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.625066 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqvnp\" (UniqueName: \"kubernetes.io/projected/709f19f5-be99-41dd-9d8c-74dc997ee184-kube-api-access-bqvnp\") pod \"nova-scheduler-0\" (UID: \"709f19f5-be99-41dd-9d8c-74dc997ee184\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.628321 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"709f19f5-be99-41dd-9d8c-74dc997ee184\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.630047 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-config-data\") pod \"nova-scheduler-0\" (UID: \"709f19f5-be99-41dd-9d8c-74dc997ee184\") " pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.640978 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqvnp\" (UniqueName: \"kubernetes.io/projected/709f19f5-be99-41dd-9d8c-74dc997ee184-kube-api-access-bqvnp\") pod \"nova-scheduler-0\" (UID: \"709f19f5-be99-41dd-9d8c-74dc997ee184\") " 
pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.660857 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.660902 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.743251 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 16:53:56 crc kubenswrapper[4697]: I0220 16:53:56.898732 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3506e35-dd19-4836-ab0f-b2d86aaaa3cc" path="/var/lib/kubelet/pods/b3506e35-dd19-4836-ab0f-b2d86aaaa3cc/volumes" Feb 20 16:53:57 crc kubenswrapper[4697]: W0220 16:53:57.206690 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod709f19f5_be99_41dd_9d8c_74dc997ee184.slice/crio-f0c93d0c857472be32593ecf96c0b350803f4dff6a28c14e1bbde99c8847cdef WatchSource:0}: Error finding container f0c93d0c857472be32593ecf96c0b350803f4dff6a28c14e1bbde99c8847cdef: Status 404 returned error can't find the container with id f0c93d0c857472be32593ecf96c0b350803f4dff6a28c14e1bbde99c8847cdef Feb 20 16:53:57 crc kubenswrapper[4697]: I0220 16:53:57.208771 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 16:53:57 crc kubenswrapper[4697]: I0220 16:53:57.309648 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a52da959-5d9a-4a66-8907-400b5bd0acfa","Type":"ContainerStarted","Data":"5a3d226216f0cc8efd0e3da4f4f4c00ed5f9b2495af6a41c320998c285f8d39e"} Feb 20 16:53:57 crc kubenswrapper[4697]: I0220 16:53:57.310096 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 20 16:53:57 crc kubenswrapper[4697]: 
I0220 16:53:57.314000 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"709f19f5-be99-41dd-9d8c-74dc997ee184","Type":"ContainerStarted","Data":"f0c93d0c857472be32593ecf96c0b350803f4dff6a28c14e1bbde99c8847cdef"} Feb 20 16:53:57 crc kubenswrapper[4697]: I0220 16:53:57.326064 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.326048166 podStartE2EDuration="2.326048166s" podCreationTimestamp="2026-02-20 16:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:53:57.323854191 +0000 UTC m=+1345.103899609" watchObservedRunningTime="2026-02-20 16:53:57.326048166 +0000 UTC m=+1345.106093574" Feb 20 16:53:58 crc kubenswrapper[4697]: I0220 16:53:58.331197 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"709f19f5-be99-41dd-9d8c-74dc997ee184","Type":"ContainerStarted","Data":"66f5ba3111da8f26766fceaff09ab9017e64e506cbddb324566e751993f84279"} Feb 20 16:53:58 crc kubenswrapper[4697]: I0220 16:53:58.379353 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.379337035 podStartE2EDuration="2.379337035s" podCreationTimestamp="2026-02-20 16:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:53:58.368274715 +0000 UTC m=+1346.148320123" watchObservedRunningTime="2026-02-20 16:53:58.379337035 +0000 UTC m=+1346.159382443" Feb 20 16:54:00 crc kubenswrapper[4697]: I0220 16:54:00.088247 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 16:54:00 crc kubenswrapper[4697]: I0220 16:54:00.088749 4697 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/kube-state-metrics-0" podUID="9e054b02-2e5f-4205-a29b-e1365412c207" containerName="kube-state-metrics" containerID="cri-o://1460807ae0756fc1a012f435cf98d8e80d1d937277413601e95464f9c5149bcd" gracePeriod=30 Feb 20 16:54:00 crc kubenswrapper[4697]: I0220 16:54:00.372564 4697 generic.go:334] "Generic (PLEG): container finished" podID="9e054b02-2e5f-4205-a29b-e1365412c207" containerID="1460807ae0756fc1a012f435cf98d8e80d1d937277413601e95464f9c5149bcd" exitCode=2 Feb 20 16:54:00 crc kubenswrapper[4697]: I0220 16:54:00.372606 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9e054b02-2e5f-4205-a29b-e1365412c207","Type":"ContainerDied","Data":"1460807ae0756fc1a012f435cf98d8e80d1d937277413601e95464f9c5149bcd"} Feb 20 16:54:00 crc kubenswrapper[4697]: I0220 16:54:00.621570 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 16:54:00 crc kubenswrapper[4697]: I0220 16:54:00.704000 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4sb2\" (UniqueName: \"kubernetes.io/projected/9e054b02-2e5f-4205-a29b-e1365412c207-kube-api-access-k4sb2\") pod \"9e054b02-2e5f-4205-a29b-e1365412c207\" (UID: \"9e054b02-2e5f-4205-a29b-e1365412c207\") " Feb 20 16:54:00 crc kubenswrapper[4697]: I0220 16:54:00.732269 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e054b02-2e5f-4205-a29b-e1365412c207-kube-api-access-k4sb2" (OuterVolumeSpecName: "kube-api-access-k4sb2") pod "9e054b02-2e5f-4205-a29b-e1365412c207" (UID: "9e054b02-2e5f-4205-a29b-e1365412c207"). InnerVolumeSpecName "kube-api-access-k4sb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:54:00 crc kubenswrapper[4697]: I0220 16:54:00.806280 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4sb2\" (UniqueName: \"kubernetes.io/projected/9e054b02-2e5f-4205-a29b-e1365412c207-kube-api-access-k4sb2\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.381979 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9e054b02-2e5f-4205-a29b-e1365412c207","Type":"ContainerDied","Data":"9f0ed8ebb271460b28518b7d496b1ec1f07b2195bb59b3954c35d0850d3e8929"} Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.382046 4697 scope.go:117] "RemoveContainer" containerID="1460807ae0756fc1a012f435cf98d8e80d1d937277413601e95464f9c5149bcd" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.382208 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.408581 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.432305 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.442403 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 16:54:01 crc kubenswrapper[4697]: E0220 16:54:01.442854 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e054b02-2e5f-4205-a29b-e1365412c207" containerName="kube-state-metrics" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.442872 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e054b02-2e5f-4205-a29b-e1365412c207" containerName="kube-state-metrics" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.443081 4697 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9e054b02-2e5f-4205-a29b-e1365412c207" containerName="kube-state-metrics" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.443782 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.446794 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.447637 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.455787 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.519408 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/77b717c8-8b8d-4236-bd5e-95fb768f1f89-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.519606 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br4dh\" (UniqueName: \"kubernetes.io/projected/77b717c8-8b8d-4236-bd5e-95fb768f1f89-kube-api-access-br4dh\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.519693 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/77b717c8-8b8d-4236-bd5e-95fb768f1f89-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " 
pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.519768 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77b717c8-8b8d-4236-bd5e-95fb768f1f89-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.621816 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/77b717c8-8b8d-4236-bd5e-95fb768f1f89-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.621878 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br4dh\" (UniqueName: \"kubernetes.io/projected/77b717c8-8b8d-4236-bd5e-95fb768f1f89-kube-api-access-br4dh\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.621929 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/77b717c8-8b8d-4236-bd5e-95fb768f1f89-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.621975 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77b717c8-8b8d-4236-bd5e-95fb768f1f89-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " pod="openstack/kube-state-metrics-0" 
Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.626679 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/77b717c8-8b8d-4236-bd5e-95fb768f1f89-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.628176 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77b717c8-8b8d-4236-bd5e-95fb768f1f89-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.631099 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/77b717c8-8b8d-4236-bd5e-95fb768f1f89-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.648563 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br4dh\" (UniqueName: \"kubernetes.io/projected/77b717c8-8b8d-4236-bd5e-95fb768f1f89-kube-api-access-br4dh\") pod \"kube-state-metrics-0\" (UID: \"77b717c8-8b8d-4236-bd5e-95fb768f1f89\") " pod="openstack/kube-state-metrics-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.662594 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.662632 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.745282 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-scheduler-0" Feb 20 16:54:01 crc kubenswrapper[4697]: I0220 16:54:01.768034 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 16:54:02 crc kubenswrapper[4697]: W0220 16:54:02.259545 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77b717c8_8b8d_4236_bd5e_95fb768f1f89.slice/crio-025834be5da40b2139d20f0ab8b37059ad568aa0183dd32a8da09ec5acf4e232 WatchSource:0}: Error finding container 025834be5da40b2139d20f0ab8b37059ad568aa0183dd32a8da09ec5acf4e232: Status 404 returned error can't find the container with id 025834be5da40b2139d20f0ab8b37059ad568aa0183dd32a8da09ec5acf4e232 Feb 20 16:54:02 crc kubenswrapper[4697]: I0220 16:54:02.261213 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 16:54:02 crc kubenswrapper[4697]: I0220 16:54:02.426759 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77b717c8-8b8d-4236-bd5e-95fb768f1f89","Type":"ContainerStarted","Data":"025834be5da40b2139d20f0ab8b37059ad568aa0183dd32a8da09ec5acf4e232"} Feb 20 16:54:02 crc kubenswrapper[4697]: I0220 16:54:02.447366 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:54:02 crc kubenswrapper[4697]: I0220 16:54:02.447649 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="ceilometer-central-agent" containerID="cri-o://f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182" gracePeriod=30 Feb 20 16:54:02 crc kubenswrapper[4697]: I0220 16:54:02.447719 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="sg-core" 
containerID="cri-o://eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346" gracePeriod=30 Feb 20 16:54:02 crc kubenswrapper[4697]: I0220 16:54:02.447767 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="ceilometer-notification-agent" containerID="cri-o://717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931" gracePeriod=30 Feb 20 16:54:02 crc kubenswrapper[4697]: I0220 16:54:02.447904 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="proxy-httpd" containerID="cri-o://2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463" gracePeriod=30 Feb 20 16:54:02 crc kubenswrapper[4697]: I0220 16:54:02.680658 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 16:54:02 crc kubenswrapper[4697]: I0220 16:54:02.680740 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 16:54:02 crc kubenswrapper[4697]: I0220 16:54:02.893690 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e054b02-2e5f-4205-a29b-e1365412c207" path="/var/lib/kubelet/pods/9e054b02-2e5f-4205-a29b-e1365412c207/volumes" Feb 20 16:54:03 crc kubenswrapper[4697]: I0220 16:54:03.436180 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"77b717c8-8b8d-4236-bd5e-95fb768f1f89","Type":"ContainerStarted","Data":"5f229235cc4b0e5b1d3d395b57ca4bb98747c62c8e4fba8412831d1ea1ea3c7a"} Feb 20 16:54:03 crc kubenswrapper[4697]: I0220 16:54:03.437969 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 20 16:54:03 crc kubenswrapper[4697]: I0220 16:54:03.440903 4697 generic.go:334] "Generic (PLEG): container finished" podID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerID="2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463" exitCode=0 Feb 20 16:54:03 crc kubenswrapper[4697]: I0220 16:54:03.440957 4697 generic.go:334] "Generic (PLEG): container finished" podID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerID="eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346" exitCode=2 Feb 20 16:54:03 crc kubenswrapper[4697]: I0220 16:54:03.440985 4697 generic.go:334] "Generic (PLEG): container finished" podID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerID="f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182" exitCode=0 Feb 20 16:54:03 crc kubenswrapper[4697]: I0220 16:54:03.440959 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fecc4170-b5a7-4f50-b0c5-99408fae97a8","Type":"ContainerDied","Data":"2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463"} Feb 20 16:54:03 crc kubenswrapper[4697]: I0220 16:54:03.441050 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fecc4170-b5a7-4f50-b0c5-99408fae97a8","Type":"ContainerDied","Data":"eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346"} Feb 20 16:54:03 crc kubenswrapper[4697]: I0220 16:54:03.441080 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fecc4170-b5a7-4f50-b0c5-99408fae97a8","Type":"ContainerDied","Data":"f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182"} Feb 20 16:54:03 crc 
kubenswrapper[4697]: I0220 16:54:03.458608 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.932721476 podStartE2EDuration="2.458579076s" podCreationTimestamp="2026-02-20 16:54:01 +0000 UTC" firstStartedPulling="2026-02-20 16:54:02.273247147 +0000 UTC m=+1350.053292555" lastFinishedPulling="2026-02-20 16:54:02.799104757 +0000 UTC m=+1350.579150155" observedRunningTime="2026-02-20 16:54:03.454743999 +0000 UTC m=+1351.234789417" watchObservedRunningTime="2026-02-20 16:54:03.458579076 +0000 UTC m=+1351.238624514"
Feb 20 16:54:04 crc kubenswrapper[4697]: I0220 16:54:04.640366 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 20 16:54:04 crc kubenswrapper[4697]: I0220 16:54:04.640840 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 20 16:54:05 crc kubenswrapper[4697]: I0220 16:54:05.723623 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.223:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 20 16:54:05 crc kubenswrapper[4697]: I0220 16:54:05.724319 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.223:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 20 16:54:05 crc kubenswrapper[4697]: I0220 16:54:05.726306 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 20 16:54:06 crc kubenswrapper[4697]: I0220 16:54:06.743600 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 20 16:54:06 crc kubenswrapper[4697]: I0220 16:54:06.777669 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 20 16:54:07 crc kubenswrapper[4697]: I0220 16:54:07.504892 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 20 16:54:08 crc kubenswrapper[4697]: I0220 16:54:08.978659 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.170817 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-sg-core-conf-yaml\") pod \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") "
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.170896 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5pdl\" (UniqueName: \"kubernetes.io/projected/fecc4170-b5a7-4f50-b0c5-99408fae97a8-kube-api-access-n5pdl\") pod \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") "
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.171022 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-config-data\") pod \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") "
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.171125 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-log-httpd\") pod \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") "
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.171174 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-scripts\") pod \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") "
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.171235 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-run-httpd\") pod \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") "
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.171327 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-combined-ca-bundle\") pod \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\" (UID: \"fecc4170-b5a7-4f50-b0c5-99408fae97a8\") "
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.182116 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fecc4170-b5a7-4f50-b0c5-99408fae97a8" (UID: "fecc4170-b5a7-4f50-b0c5-99408fae97a8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.183187 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fecc4170-b5a7-4f50-b0c5-99408fae97a8" (UID: "fecc4170-b5a7-4f50-b0c5-99408fae97a8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.183814 4697 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.183830 4697 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fecc4170-b5a7-4f50-b0c5-99408fae97a8-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.295257 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-scripts" (OuterVolumeSpecName: "scripts") pod "fecc4170-b5a7-4f50-b0c5-99408fae97a8" (UID: "fecc4170-b5a7-4f50-b0c5-99408fae97a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.296077 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fecc4170-b5a7-4f50-b0c5-99408fae97a8-kube-api-access-n5pdl" (OuterVolumeSpecName: "kube-api-access-n5pdl") pod "fecc4170-b5a7-4f50-b0c5-99408fae97a8" (UID: "fecc4170-b5a7-4f50-b0c5-99408fae97a8"). InnerVolumeSpecName "kube-api-access-n5pdl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.301449 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fecc4170-b5a7-4f50-b0c5-99408fae97a8" (UID: "fecc4170-b5a7-4f50-b0c5-99408fae97a8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.373023 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-config-data" (OuterVolumeSpecName: "config-data") pod "fecc4170-b5a7-4f50-b0c5-99408fae97a8" (UID: "fecc4170-b5a7-4f50-b0c5-99408fae97a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.380888 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fecc4170-b5a7-4f50-b0c5-99408fae97a8" (UID: "fecc4170-b5a7-4f50-b0c5-99408fae97a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.387668 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.387709 4697 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.387725 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5pdl\" (UniqueName: \"kubernetes.io/projected/fecc4170-b5a7-4f50-b0c5-99408fae97a8-kube-api-access-n5pdl\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.387741 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.387752 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fecc4170-b5a7-4f50-b0c5-99408fae97a8-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.498175 4697 generic.go:334] "Generic (PLEG): container finished" podID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerID="717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931" exitCode=0
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.498228 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fecc4170-b5a7-4f50-b0c5-99408fae97a8","Type":"ContainerDied","Data":"717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931"}
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.498241 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.498259 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fecc4170-b5a7-4f50-b0c5-99408fae97a8","Type":"ContainerDied","Data":"08002d44bcc7aaa48089e460d56ac48d089639b7b2ebaffe531fbf5c69c841f7"}
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.498280 4697 scope.go:117] "RemoveContainer" containerID="2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.520791 4697 scope.go:117] "RemoveContainer" containerID="eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.541572 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.546773 4697 scope.go:117] "RemoveContainer" containerID="717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.580629 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.585917 4697 scope.go:117] "RemoveContainer" containerID="f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.604722 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 20 16:54:09 crc kubenswrapper[4697]: E0220 16:54:09.606197 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="ceilometer-notification-agent"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.606342 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="ceilometer-notification-agent"
Feb 20 16:54:09 crc kubenswrapper[4697]: E0220 16:54:09.606518 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="proxy-httpd"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.606601 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="proxy-httpd"
Feb 20 16:54:09 crc kubenswrapper[4697]: E0220 16:54:09.606715 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="ceilometer-central-agent"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.606787 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="ceilometer-central-agent"
Feb 20 16:54:09 crc kubenswrapper[4697]: E0220 16:54:09.606879 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="sg-core"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.606959 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="sg-core"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.607615 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="proxy-httpd"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.607813 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="sg-core"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.607947 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="ceilometer-central-agent"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.608053 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" containerName="ceilometer-notification-agent"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.612937 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.636153 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.636459 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.636748 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.645415 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.662572 4697 scope.go:117] "RemoveContainer" containerID="2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463"
Feb 20 16:54:09 crc kubenswrapper[4697]: E0220 16:54:09.663130 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463\": container with ID starting with 2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463 not found: ID does not exist" containerID="2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.663163 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463"} err="failed to get container status \"2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463\": rpc error: code = NotFound desc = could not find container \"2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463\": container with ID starting with 2a855021d27b801f4b686acd8eb502ba4aae6d7b61d5bb64b1fd88e04f8f8463 not found: ID does not exist"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.663189 4697 scope.go:117] "RemoveContainer" containerID="eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346"
Feb 20 16:54:09 crc kubenswrapper[4697]: E0220 16:54:09.663493 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346\": container with ID starting with eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346 not found: ID does not exist" containerID="eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.663514 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346"} err="failed to get container status \"eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346\": rpc error: code = NotFound desc = could not find container \"eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346\": container with ID starting with eaa019c5ee33e917029ccb88634271044f59ddd765fe09d5893925ecbe37e346 not found: ID does not exist"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.663544 4697 scope.go:117] "RemoveContainer" containerID="717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931"
Feb 20 16:54:09 crc kubenswrapper[4697]: E0220 16:54:09.663876 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931\": container with ID starting with 717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931 not found: ID does not exist" containerID="717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.663933 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931"} err="failed to get container status \"717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931\": rpc error: code = NotFound desc = could not find container \"717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931\": container with ID starting with 717a0a4c8305365e5e6c9e8ef83bfb009736c3cbff49f1482deddf6febe30931 not found: ID does not exist"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.663971 4697 scope.go:117] "RemoveContainer" containerID="f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182"
Feb 20 16:54:09 crc kubenswrapper[4697]: E0220 16:54:09.664400 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182\": container with ID starting with f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182 not found: ID does not exist" containerID="f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.664426 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182"} err="failed to get container status \"f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182\": rpc error: code = NotFound desc = could not find container \"f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182\": container with ID starting with f4dbf60edcad56cb08b7007d9141638cbf367fd5faa554b5587bea06f9d1b182 not found: ID does not exist"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.805321 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-run-httpd\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.805402 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.805468 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-scripts\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.805525 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.805547 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.805567 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-config-data\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.805614 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.806031 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdvrs\" (UniqueName: \"kubernetes.io/projected/8d13cf2a-7046-4102-80aa-c3f2eff127c0-kube-api-access-gdvrs\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.907465 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.907529 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-scripts\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.907577 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.907600 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.907628 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-config-data\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.907681 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.907724 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdvrs\" (UniqueName: \"kubernetes.io/projected/8d13cf2a-7046-4102-80aa-c3f2eff127c0-kube-api-access-gdvrs\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.907794 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-run-httpd\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.908285 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-run-httpd\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.908702 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.912186 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-scripts\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.912903 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-config-data\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.913528 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.916292 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.925151 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdvrs\" (UniqueName: \"kubernetes.io/projected/8d13cf2a-7046-4102-80aa-c3f2eff127c0-kube-api-access-gdvrs\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.927338 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " pod="openstack/ceilometer-0"
Feb 20 16:54:09 crc kubenswrapper[4697]: I0220 16:54:09.957518 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 16:54:10 crc kubenswrapper[4697]: I0220 16:54:10.434235 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 16:54:10 crc kubenswrapper[4697]: I0220 16:54:10.511281 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d13cf2a-7046-4102-80aa-c3f2eff127c0","Type":"ContainerStarted","Data":"e1326422a56f0327c6fa0d7feef602d6d6f87e5502bf175fbd421556cb225012"}
Feb 20 16:54:10 crc kubenswrapper[4697]: I0220 16:54:10.889068 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fecc4170-b5a7-4f50-b0c5-99408fae97a8" path="/var/lib/kubelet/pods/fecc4170-b5a7-4f50-b0c5-99408fae97a8/volumes"
Feb 20 16:54:11 crc kubenswrapper[4697]: I0220 16:54:11.526373 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d13cf2a-7046-4102-80aa-c3f2eff127c0","Type":"ContainerStarted","Data":"9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6"}
Feb 20 16:54:11 crc kubenswrapper[4697]: I0220 16:54:11.526752 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d13cf2a-7046-4102-80aa-c3f2eff127c0","Type":"ContainerStarted","Data":"25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d"}
Feb 20 16:54:11 crc kubenswrapper[4697]: I0220 16:54:11.666424 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 16:54:11 crc kubenswrapper[4697]: I0220 16:54:11.667000 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 16:54:11 crc kubenswrapper[4697]: I0220 16:54:11.673029 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 20 16:54:11 crc kubenswrapper[4697]: I0220 16:54:11.779187 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 20 16:54:12 crc kubenswrapper[4697]: I0220 16:54:12.539486 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d13cf2a-7046-4102-80aa-c3f2eff127c0","Type":"ContainerStarted","Data":"688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904"}
Feb 20 16:54:12 crc kubenswrapper[4697]: I0220 16:54:12.545531 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.519333 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.550539 4697 generic.go:334] "Generic (PLEG): container finished" podID="e8ff77f1-cb68-4ab0-ade5-1d692fff2039" containerID="91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879" exitCode=137
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.551668 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.552468 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e8ff77f1-cb68-4ab0-ade5-1d692fff2039","Type":"ContainerDied","Data":"91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879"}
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.552505 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e8ff77f1-cb68-4ab0-ade5-1d692fff2039","Type":"ContainerDied","Data":"cbbbd63089b74a554bb6496439c5765ebb4c43fc644d34fabd0c95fb6db3db06"}
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.552522 4697 scope.go:117] "RemoveContainer" containerID="91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.588355 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-combined-ca-bundle\") pod \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\" (UID: \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") "
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.588397 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-config-data\") pod \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\" (UID: \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") "
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.588546 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqr74\" (UniqueName: \"kubernetes.io/projected/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-kube-api-access-wqr74\") pod \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\" (UID: \"e8ff77f1-cb68-4ab0-ade5-1d692fff2039\") "
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.595945 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-kube-api-access-wqr74" (OuterVolumeSpecName: "kube-api-access-wqr74") pod "e8ff77f1-cb68-4ab0-ade5-1d692fff2039" (UID: "e8ff77f1-cb68-4ab0-ade5-1d692fff2039"). InnerVolumeSpecName "kube-api-access-wqr74". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.629241 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8ff77f1-cb68-4ab0-ade5-1d692fff2039" (UID: "e8ff77f1-cb68-4ab0-ade5-1d692fff2039"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.645595 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-config-data" (OuterVolumeSpecName: "config-data") pod "e8ff77f1-cb68-4ab0-ade5-1d692fff2039" (UID: "e8ff77f1-cb68-4ab0-ade5-1d692fff2039"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.648058 4697 scope.go:117] "RemoveContainer" containerID="91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879"
Feb 20 16:54:13 crc kubenswrapper[4697]: E0220 16:54:13.648324 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879\": container with ID starting with 91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879 not found: ID does not exist" containerID="91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.648351 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879"} err="failed to get container status \"91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879\": rpc error: code = NotFound desc = could not find container \"91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879\": container with ID starting with 91536eae69746d5e878829ec8c9d7fc422102e35509d1ee3dd8f44ee03160879 not found: ID does not exist"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.690958 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqr74\" (UniqueName: \"kubernetes.io/projected/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-kube-api-access-wqr74\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.690985 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.690996 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ff77f1-cb68-4ab0-ade5-1d692fff2039-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.881450 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.893768 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.919201 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 20 16:54:13 crc kubenswrapper[4697]: E0220 16:54:13.919768 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ff77f1-cb68-4ab0-ade5-1d692fff2039" containerName="nova-cell1-novncproxy-novncproxy"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.919794 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ff77f1-cb68-4ab0-ade5-1d692fff2039" containerName="nova-cell1-novncproxy-novncproxy"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.920040 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ff77f1-cb68-4ab0-ade5-1d692fff2039" containerName="nova-cell1-novncproxy-novncproxy"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.920956 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.926884 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.927045 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.927363 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.931270 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.997771 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.997863 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjzx\" (UniqueName: \"kubernetes.io/projected/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-kube-api-access-cfjzx\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.997892 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 20 16:54:13
crc kubenswrapper[4697]: I0220 16:54:13.998029 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:13 crc kubenswrapper[4697]: I0220 16:54:13.998144 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.099968 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.100085 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.100120 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfjzx\" (UniqueName: \"kubernetes.io/projected/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-kube-api-access-cfjzx\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc 
kubenswrapper[4697]: I0220 16:54:14.100834 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.100946 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.104620 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.104895 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.105387 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.106156 4697 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.131600 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfjzx\" (UniqueName: \"kubernetes.io/projected/4b1ddcd6-abc4-467c-8cd1-1937c803e0b4-kube-api-access-cfjzx\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.237784 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.563284 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d13cf2a-7046-4102-80aa-c3f2eff127c0","Type":"ContainerStarted","Data":"21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65"} Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.563887 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.585929 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.5037164130000003 podStartE2EDuration="5.585911276s" podCreationTimestamp="2026-02-20 16:54:09 +0000 UTC" firstStartedPulling="2026-02-20 16:54:10.442360046 +0000 UTC m=+1358.222405464" lastFinishedPulling="2026-02-20 16:54:13.524554919 +0000 UTC m=+1361.304600327" observedRunningTime="2026-02-20 16:54:14.57945298 +0000 UTC m=+1362.359498408" watchObservedRunningTime="2026-02-20 16:54:14.585911276 +0000 UTC m=+1362.365956684" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.645827 4697 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.646172 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.649501 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.653827 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.715790 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 16:54:14 crc kubenswrapper[4697]: I0220 16:54:14.896265 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ff77f1-cb68-4ab0-ade5-1d692fff2039" path="/var/lib/kubelet/pods/e8ff77f1-cb68-4ab0-ade5-1d692fff2039/volumes" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.575591 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4","Type":"ContainerStarted","Data":"6873bd153b03dea3f44ae6e964a0bfa7b1dffef3a0b577784df13cb92d10dac9"} Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.575998 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4b1ddcd6-abc4-467c-8cd1-1937c803e0b4","Type":"ContainerStarted","Data":"ddcf73683319cd1b5cc56b62ddc82f954abff544eebc56080c30e4c983deb7f8"} Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.576366 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.583251 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.598288 4697 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.598265119 podStartE2EDuration="2.598265119s" podCreationTimestamp="2026-02-20 16:54:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:54:15.593261347 +0000 UTC m=+1363.373306755" watchObservedRunningTime="2026-02-20 16:54:15.598265119 +0000 UTC m=+1363.378310537" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.768195 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f7757544c-2rfnw"] Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.770381 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.794237 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f7757544c-2rfnw"] Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.844735 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvq74\" (UniqueName: \"kubernetes.io/projected/4638cd5a-b95f-482f-b554-cd95fddfa551-kube-api-access-kvq74\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.845027 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-nb\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.845189 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-config\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.845278 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-sb\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.845365 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-swift-storage-0\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.845484 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-svc\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.947778 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-config\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.947837 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-sb\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.947879 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-swift-storage-0\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.947959 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-svc\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.948002 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvq74\" (UniqueName: \"kubernetes.io/projected/4638cd5a-b95f-482f-b554-cd95fddfa551-kube-api-access-kvq74\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.948036 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-nb\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.948717 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-swift-storage-0\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.949253 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-config\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.949480 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-nb\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.949579 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-sb\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.950117 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-svc\") pod \"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:15 crc kubenswrapper[4697]: I0220 16:54:15.975875 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvq74\" (UniqueName: \"kubernetes.io/projected/4638cd5a-b95f-482f-b554-cd95fddfa551-kube-api-access-kvq74\") pod 
\"dnsmasq-dns-7f7757544c-2rfnw\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:16 crc kubenswrapper[4697]: I0220 16:54:16.107899 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:16 crc kubenswrapper[4697]: I0220 16:54:16.652221 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f7757544c-2rfnw"] Feb 20 16:54:16 crc kubenswrapper[4697]: W0220 16:54:16.665673 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4638cd5a_b95f_482f_b554_cd95fddfa551.slice/crio-77a6a1b07f1ec54a68ba9528b961bcedbb245f68efc20c6cbc63255239ec56a3 WatchSource:0}: Error finding container 77a6a1b07f1ec54a68ba9528b961bcedbb245f68efc20c6cbc63255239ec56a3: Status 404 returned error can't find the container with id 77a6a1b07f1ec54a68ba9528b961bcedbb245f68efc20c6cbc63255239ec56a3 Feb 20 16:54:17 crc kubenswrapper[4697]: I0220 16:54:17.467669 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 16:54:17 crc kubenswrapper[4697]: I0220 16:54:17.467986 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="ceilometer-central-agent" containerID="cri-o://25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d" gracePeriod=30 Feb 20 16:54:17 crc kubenswrapper[4697]: I0220 16:54:17.468086 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="sg-core" containerID="cri-o://688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904" gracePeriod=30 Feb 20 16:54:17 crc kubenswrapper[4697]: I0220 16:54:17.468081 4697 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="proxy-httpd" containerID="cri-o://21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65" gracePeriod=30 Feb 20 16:54:17 crc kubenswrapper[4697]: I0220 16:54:17.468159 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="ceilometer-notification-agent" containerID="cri-o://9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6" gracePeriod=30 Feb 20 16:54:17 crc kubenswrapper[4697]: I0220 16:54:17.596912 4697 generic.go:334] "Generic (PLEG): container finished" podID="4638cd5a-b95f-482f-b554-cd95fddfa551" containerID="903defd0b77d02cfd99bcfd7f32d7e5c35d0070e52a758a1d595c926fd828c50" exitCode=0 Feb 20 16:54:17 crc kubenswrapper[4697]: I0220 16:54:17.597004 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" event={"ID":"4638cd5a-b95f-482f-b554-cd95fddfa551","Type":"ContainerDied","Data":"903defd0b77d02cfd99bcfd7f32d7e5c35d0070e52a758a1d595c926fd828c50"} Feb 20 16:54:17 crc kubenswrapper[4697]: I0220 16:54:17.597321 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" event={"ID":"4638cd5a-b95f-482f-b554-cd95fddfa551","Type":"ContainerStarted","Data":"77a6a1b07f1ec54a68ba9528b961bcedbb245f68efc20c6cbc63255239ec56a3"} Feb 20 16:54:17 crc kubenswrapper[4697]: I0220 16:54:17.915362 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 16:54:18 crc kubenswrapper[4697]: I0220 16:54:18.608554 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" event={"ID":"4638cd5a-b95f-482f-b554-cd95fddfa551","Type":"ContainerStarted","Data":"6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463"} Feb 20 16:54:18 crc kubenswrapper[4697]: I0220 16:54:18.609491 4697 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:18 crc kubenswrapper[4697]: I0220 16:54:18.610762 4697 generic.go:334] "Generic (PLEG): container finished" podID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerID="21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65" exitCode=0 Feb 20 16:54:18 crc kubenswrapper[4697]: I0220 16:54:18.610794 4697 generic.go:334] "Generic (PLEG): container finished" podID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerID="688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904" exitCode=2 Feb 20 16:54:18 crc kubenswrapper[4697]: I0220 16:54:18.610806 4697 generic.go:334] "Generic (PLEG): container finished" podID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerID="25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d" exitCode=0 Feb 20 16:54:18 crc kubenswrapper[4697]: I0220 16:54:18.610992 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerName="nova-api-log" containerID="cri-o://0e5b77b346c61e9bf02571ba4b360b3d290b85b0864ba7edc44d1a116bf85a1f" gracePeriod=30 Feb 20 16:54:18 crc kubenswrapper[4697]: I0220 16:54:18.611070 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d13cf2a-7046-4102-80aa-c3f2eff127c0","Type":"ContainerDied","Data":"21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65"} Feb 20 16:54:18 crc kubenswrapper[4697]: I0220 16:54:18.611105 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d13cf2a-7046-4102-80aa-c3f2eff127c0","Type":"ContainerDied","Data":"688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904"} Feb 20 16:54:18 crc kubenswrapper[4697]: I0220 16:54:18.611123 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8d13cf2a-7046-4102-80aa-c3f2eff127c0","Type":"ContainerDied","Data":"25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d"} Feb 20 16:54:18 crc kubenswrapper[4697]: I0220 16:54:18.611182 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerName="nova-api-api" containerID="cri-o://3f31414cefa11dd99157f117dc9b56bbf6c962f8588f150161df2bd61702f4f0" gracePeriod=30 Feb 20 16:54:18 crc kubenswrapper[4697]: I0220 16:54:18.645030 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" podStartSLOduration=3.645007054 podStartE2EDuration="3.645007054s" podCreationTimestamp="2026-02-20 16:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:54:18.628651207 +0000 UTC m=+1366.408696625" watchObservedRunningTime="2026-02-20 16:54:18.645007054 +0000 UTC m=+1366.425052462" Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.238777 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.271923 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.328806 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-config-data\") pod \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.328863 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-run-httpd\") pod \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.328901 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-combined-ca-bundle\") pod \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.328978 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-ceilometer-tls-certs\") pod \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.329062 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdvrs\" (UniqueName: \"kubernetes.io/projected/8d13cf2a-7046-4102-80aa-c3f2eff127c0-kube-api-access-gdvrs\") pod \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") " Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.329085 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-scripts\") pod \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") "
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.329167 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-sg-core-conf-yaml\") pod \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") "
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.329196 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-log-httpd\") pod \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\" (UID: \"8d13cf2a-7046-4102-80aa-c3f2eff127c0\") "
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.329190 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8d13cf2a-7046-4102-80aa-c3f2eff127c0" (UID: "8d13cf2a-7046-4102-80aa-c3f2eff127c0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.329628 4697 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.329927 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8d13cf2a-7046-4102-80aa-c3f2eff127c0" (UID: "8d13cf2a-7046-4102-80aa-c3f2eff127c0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.342671 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d13cf2a-7046-4102-80aa-c3f2eff127c0-kube-api-access-gdvrs" (OuterVolumeSpecName: "kube-api-access-gdvrs") pod "8d13cf2a-7046-4102-80aa-c3f2eff127c0" (UID: "8d13cf2a-7046-4102-80aa-c3f2eff127c0"). InnerVolumeSpecName "kube-api-access-gdvrs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.342752 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-scripts" (OuterVolumeSpecName: "scripts") pod "8d13cf2a-7046-4102-80aa-c3f2eff127c0" (UID: "8d13cf2a-7046-4102-80aa-c3f2eff127c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.387787 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8d13cf2a-7046-4102-80aa-c3f2eff127c0" (UID: "8d13cf2a-7046-4102-80aa-c3f2eff127c0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.411657 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8d13cf2a-7046-4102-80aa-c3f2eff127c0" (UID: "8d13cf2a-7046-4102-80aa-c3f2eff127c0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.417495 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d13cf2a-7046-4102-80aa-c3f2eff127c0" (UID: "8d13cf2a-7046-4102-80aa-c3f2eff127c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.430940 4697 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.430973 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdvrs\" (UniqueName: \"kubernetes.io/projected/8d13cf2a-7046-4102-80aa-c3f2eff127c0-kube-api-access-gdvrs\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.430985 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.430996 4697 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.431004 4697 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d13cf2a-7046-4102-80aa-c3f2eff127c0-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.431012 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.444390 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-config-data" (OuterVolumeSpecName: "config-data") pod "8d13cf2a-7046-4102-80aa-c3f2eff127c0" (UID: "8d13cf2a-7046-4102-80aa-c3f2eff127c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.532720 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d13cf2a-7046-4102-80aa-c3f2eff127c0-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.625217 4697 generic.go:334] "Generic (PLEG): container finished" podID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerID="3f31414cefa11dd99157f117dc9b56bbf6c962f8588f150161df2bd61702f4f0" exitCode=0
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.625245 4697 generic.go:334] "Generic (PLEG): container finished" podID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerID="0e5b77b346c61e9bf02571ba4b360b3d290b85b0864ba7edc44d1a116bf85a1f" exitCode=143
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.625334 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"92b2a1ec-bc80-46bc-bb32-23fb81c317f0","Type":"ContainerDied","Data":"3f31414cefa11dd99157f117dc9b56bbf6c962f8588f150161df2bd61702f4f0"}
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.625459 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"92b2a1ec-bc80-46bc-bb32-23fb81c317f0","Type":"ContainerDied","Data":"0e5b77b346c61e9bf02571ba4b360b3d290b85b0864ba7edc44d1a116bf85a1f"}
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.627693 4697 generic.go:334] "Generic (PLEG): container finished" podID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerID="9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6" exitCode=0
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.628285 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.628747 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d13cf2a-7046-4102-80aa-c3f2eff127c0","Type":"ContainerDied","Data":"9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6"}
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.628782 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d13cf2a-7046-4102-80aa-c3f2eff127c0","Type":"ContainerDied","Data":"e1326422a56f0327c6fa0d7feef602d6d6f87e5502bf175fbd421556cb225012"}
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.628799 4697 scope.go:117] "RemoveContainer" containerID="21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.681715 4697 scope.go:117] "RemoveContainer" containerID="688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.682157 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.692324 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.706529 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 20 16:54:19 crc kubenswrapper[4697]: E0220 16:54:19.706999 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="proxy-httpd"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.707021 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="proxy-httpd"
Feb 20 16:54:19 crc kubenswrapper[4697]: E0220 16:54:19.707043 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="ceilometer-central-agent"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.707052 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="ceilometer-central-agent"
Feb 20 16:54:19 crc kubenswrapper[4697]: E0220 16:54:19.707094 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="ceilometer-notification-agent"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.707102 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="ceilometer-notification-agent"
Feb 20 16:54:19 crc kubenswrapper[4697]: E0220 16:54:19.707120 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="sg-core"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.707129 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="sg-core"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.707346 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="ceilometer-central-agent"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.707367 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="sg-core"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.707391 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="proxy-httpd"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.707405 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" containerName="ceilometer-notification-agent"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.709517 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.711562 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.711778 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.712340 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.740204 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.741688 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.742183 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.742247 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3336f2b-7449-4ad5-9696-7565e147beab-log-httpd\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.742334 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.742371 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3336f2b-7449-4ad5-9696-7565e147beab-run-httpd\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.742465 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-config-data\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.742776 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6k54\" (UniqueName: \"kubernetes.io/projected/a3336f2b-7449-4ad5-9696-7565e147beab-kube-api-access-f6k54\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.742813 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-scripts\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.743253 4697 scope.go:117] "RemoveContainer" containerID="9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.814163 4697 scope.go:117] "RemoveContainer" containerID="25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.837620 4697 scope.go:117] "RemoveContainer" containerID="21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65"
Feb 20 16:54:19 crc kubenswrapper[4697]: E0220 16:54:19.838079 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65\": container with ID starting with 21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65 not found: ID does not exist" containerID="21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.838125 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65"} err="failed to get container status \"21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65\": rpc error: code = NotFound desc = could not find container \"21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65\": container with ID starting with 21d299fa9ccde75ee2e473de74667fa2cd64d7b19c42d6cd687835ec9e0a9a65 not found: ID does not exist"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.838154 4697 scope.go:117] "RemoveContainer" containerID="688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904"
Feb 20 16:54:19 crc kubenswrapper[4697]: E0220 16:54:19.839289 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904\": container with ID starting with 688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904 not found: ID does not exist" containerID="688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.839346 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904"} err="failed to get container status \"688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904\": rpc error: code = NotFound desc = could not find container \"688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904\": container with ID starting with 688936d93f9e65db1526df5c4dc573076163aeb0116f66aac7ab1ab6ea543904 not found: ID does not exist"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.839376 4697 scope.go:117] "RemoveContainer" containerID="9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6"
Feb 20 16:54:19 crc kubenswrapper[4697]: E0220 16:54:19.839720 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6\": container with ID starting with 9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6 not found: ID does not exist" containerID="9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.839739 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6"} err="failed to get container status \"9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6\": rpc error: code = NotFound desc = could not find container \"9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6\": container with ID starting with 9c13b287cd21343a955bbedc7ea31c8ae075f0889c85425583ae54babc798fe6 not found: ID does not exist"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.839755 4697 scope.go:117] "RemoveContainer" containerID="25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d"
Feb 20 16:54:19 crc kubenswrapper[4697]: E0220 16:54:19.839995 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d\": container with ID starting with 25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d not found: ID does not exist" containerID="25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.840023 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d"} err="failed to get container status \"25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d\": rpc error: code = NotFound desc = could not find container \"25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d\": container with ID starting with 25e5d2652a75b1132804c132bb8e36db5403e1f99c92f480286cd8f95bba758d not found: ID does not exist"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.844630 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.844674 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3336f2b-7449-4ad5-9696-7565e147beab-log-httpd\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.844708 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.844726 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3336f2b-7449-4ad5-9696-7565e147beab-run-httpd\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.844753 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-config-data\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.844780 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6k54\" (UniqueName: \"kubernetes.io/projected/a3336f2b-7449-4ad5-9696-7565e147beab-kube-api-access-f6k54\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.844798 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-scripts\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.844826 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.845421 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3336f2b-7449-4ad5-9696-7565e147beab-log-httpd\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.845516 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3336f2b-7449-4ad5-9696-7565e147beab-run-httpd\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.851125 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-scripts\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.852147 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.852676 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.852905 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-config-data\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.853222 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3336f2b-7449-4ad5-9696-7565e147beab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.861396 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6k54\" (UniqueName: \"kubernetes.io/projected/a3336f2b-7449-4ad5-9696-7565e147beab-kube-api-access-f6k54\") pod \"ceilometer-0\" (UID: \"a3336f2b-7449-4ad5-9696-7565e147beab\") " pod="openstack/ceilometer-0"
Feb 20 16:54:19 crc kubenswrapper[4697]: I0220 16:54:19.960533 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.050873 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-logs\") pod \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") "
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.050982 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-combined-ca-bundle\") pod \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") "
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.051034 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-config-data\") pod \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") "
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.051069 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2z7f\" (UniqueName: \"kubernetes.io/projected/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-kube-api-access-b2z7f\") pod \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\" (UID: \"92b2a1ec-bc80-46bc-bb32-23fb81c317f0\") "
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.051895 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-logs" (OuterVolumeSpecName: "logs") pod "92b2a1ec-bc80-46bc-bb32-23fb81c317f0" (UID: "92b2a1ec-bc80-46bc-bb32-23fb81c317f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.059597 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-kube-api-access-b2z7f" (OuterVolumeSpecName: "kube-api-access-b2z7f") pod "92b2a1ec-bc80-46bc-bb32-23fb81c317f0" (UID: "92b2a1ec-bc80-46bc-bb32-23fb81c317f0"). InnerVolumeSpecName "kube-api-access-b2z7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.111508 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.128387 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92b2a1ec-bc80-46bc-bb32-23fb81c317f0" (UID: "92b2a1ec-bc80-46bc-bb32-23fb81c317f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.128627 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-config-data" (OuterVolumeSpecName: "config-data") pod "92b2a1ec-bc80-46bc-bb32-23fb81c317f0" (UID: "92b2a1ec-bc80-46bc-bb32-23fb81c317f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.156671 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-logs\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.156704 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.156716 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.156726 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2z7f\" (UniqueName: \"kubernetes.io/projected/92b2a1ec-bc80-46bc-bb32-23fb81c317f0-kube-api-access-b2z7f\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:20 crc kubenswrapper[4697]: W0220 16:54:20.568353 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3336f2b_7449_4ad5_9696_7565e147beab.slice/crio-1c4cb0153453b875639942b2558e4186d149b938a2397ebd6436326b6739eace WatchSource:0}: Error finding container 1c4cb0153453b875639942b2558e4186d149b938a2397ebd6436326b6739eace: Status 404 returned error can't find the container with id 1c4cb0153453b875639942b2558e4186d149b938a2397ebd6436326b6739eace
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.569195 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.640509 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.640501 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"92b2a1ec-bc80-46bc-bb32-23fb81c317f0","Type":"ContainerDied","Data":"224cbcfb85a3f4b9e734ff1b3e4fa9467204bfcc56bedf37cafb3e834bbe093a"}
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.640654 4697 scope.go:117] "RemoveContainer" containerID="3f31414cefa11dd99157f117dc9b56bbf6c962f8588f150161df2bd61702f4f0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.642672 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3336f2b-7449-4ad5-9696-7565e147beab","Type":"ContainerStarted","Data":"1c4cb0153453b875639942b2558e4186d149b938a2397ebd6436326b6739eace"}
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.670044 4697 scope.go:117] "RemoveContainer" containerID="0e5b77b346c61e9bf02571ba4b360b3d290b85b0864ba7edc44d1a116bf85a1f"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.680653 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.689497 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.730017 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 20 16:54:20 crc kubenswrapper[4697]: E0220 16:54:20.730552 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerName="nova-api-api"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.730569 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerName="nova-api-api"
Feb 20 16:54:20 crc kubenswrapper[4697]: E0220 16:54:20.730611 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerName="nova-api-log"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.730620 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerName="nova-api-log"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.730825 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerName="nova-api-api"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.730855 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" containerName="nova-api-log"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.732074 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.737821 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.738071 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.738229 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.779060 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qjqk\" (UniqueName: \"kubernetes.io/projected/e498ea7e-2c61-4388-939b-b9565ee47a24-kube-api-access-6qjqk\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.779566 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.779622 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-config-data\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.779642 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-public-tls-certs\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.779770 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.779842 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e498ea7e-2c61-4388-939b-b9565ee47a24-logs\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.782754 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.882077 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.882227 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-config-data\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.882253 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-public-tls-certs\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.882314 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.882455 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e498ea7e-2c61-4388-939b-b9565ee47a24-logs\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.882483 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qjqk\" (UniqueName: \"kubernetes.io/projected/e498ea7e-2c61-4388-939b-b9565ee47a24-kube-api-access-6qjqk\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0"
Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.882872 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName:
\"kubernetes.io/empty-dir/e498ea7e-2c61-4388-939b-b9565ee47a24-logs\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0" Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.888954 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d13cf2a-7046-4102-80aa-c3f2eff127c0" path="/var/lib/kubelet/pods/8d13cf2a-7046-4102-80aa-c3f2eff127c0/volumes" Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.889603 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0" Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.889761 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92b2a1ec-bc80-46bc-bb32-23fb81c317f0" path="/var/lib/kubelet/pods/92b2a1ec-bc80-46bc-bb32-23fb81c317f0/volumes" Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.897055 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-config-data\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0" Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.898708 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qjqk\" (UniqueName: \"kubernetes.io/projected/e498ea7e-2c61-4388-939b-b9565ee47a24-kube-api-access-6qjqk\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0" Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.899144 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0" Feb 20 16:54:20 crc kubenswrapper[4697]: I0220 16:54:20.900535 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-public-tls-certs\") pod \"nova-api-0\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") " pod="openstack/nova-api-0" Feb 20 16:54:21 crc kubenswrapper[4697]: I0220 16:54:21.119801 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 16:54:21 crc kubenswrapper[4697]: I0220 16:54:21.644742 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 16:54:21 crc kubenswrapper[4697]: W0220 16:54:21.660353 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode498ea7e_2c61_4388_939b_b9565ee47a24.slice/crio-503a74521c9c232fdb30a9b8f214551a7d4fd117de6c88bb2bd8ca92b4adb816 WatchSource:0}: Error finding container 503a74521c9c232fdb30a9b8f214551a7d4fd117de6c88bb2bd8ca92b4adb816: Status 404 returned error can't find the container with id 503a74521c9c232fdb30a9b8f214551a7d4fd117de6c88bb2bd8ca92b4adb816 Feb 20 16:54:21 crc kubenswrapper[4697]: I0220 16:54:21.660427 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3336f2b-7449-4ad5-9696-7565e147beab","Type":"ContainerStarted","Data":"68701af22fefa2032344c0e99d02ceb10459c8e490b74d68f202969671c27912"} Feb 20 16:54:21 crc kubenswrapper[4697]: I0220 16:54:21.660484 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3336f2b-7449-4ad5-9696-7565e147beab","Type":"ContainerStarted","Data":"1b149aa0d42504439403a2a52bdfb4e4c6a18c7728a2a2aba2744c6408eeebf9"} Feb 20 16:54:22 crc kubenswrapper[4697]: I0220 16:54:22.670166 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"e498ea7e-2c61-4388-939b-b9565ee47a24","Type":"ContainerStarted","Data":"7cfd74243b0a1a8d7fffdf3745674c37af535baf18055296d3eb6dcb6033f36a"} Feb 20 16:54:22 crc kubenswrapper[4697]: I0220 16:54:22.670797 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e498ea7e-2c61-4388-939b-b9565ee47a24","Type":"ContainerStarted","Data":"e4ad22f2ab20c38be8cbc9642c40100cf2ac496b60803b057247012223be0ed3"} Feb 20 16:54:22 crc kubenswrapper[4697]: I0220 16:54:22.670818 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e498ea7e-2c61-4388-939b-b9565ee47a24","Type":"ContainerStarted","Data":"503a74521c9c232fdb30a9b8f214551a7d4fd117de6c88bb2bd8ca92b4adb816"} Feb 20 16:54:22 crc kubenswrapper[4697]: I0220 16:54:22.672276 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3336f2b-7449-4ad5-9696-7565e147beab","Type":"ContainerStarted","Data":"10fc0b15af9675c0a4564f94a3bc54f4925b6dad585ef021ec118f8a30e6a4c1"} Feb 20 16:54:22 crc kubenswrapper[4697]: I0220 16:54:22.692106 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.69209102 podStartE2EDuration="2.69209102s" podCreationTimestamp="2026-02-20 16:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:54:22.68755102 +0000 UTC m=+1370.467596428" watchObservedRunningTime="2026-02-20 16:54:22.69209102 +0000 UTC m=+1370.472136428" Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.238641 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.256658 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:24 crc 
kubenswrapper[4697]: I0220 16:54:24.691690 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3336f2b-7449-4ad5-9696-7565e147beab","Type":"ContainerStarted","Data":"5e5e6fa7880b7d6302631368181624ba7f4495f4a828ab08a4799993bd80d7f7"} Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.691743 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.707756 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.718490 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.413103892 podStartE2EDuration="5.718471716s" podCreationTimestamp="2026-02-20 16:54:19 +0000 UTC" firstStartedPulling="2026-02-20 16:54:20.570382661 +0000 UTC m=+1368.350428069" lastFinishedPulling="2026-02-20 16:54:23.875750445 +0000 UTC m=+1371.655795893" observedRunningTime="2026-02-20 16:54:24.712815719 +0000 UTC m=+1372.492861127" watchObservedRunningTime="2026-02-20 16:54:24.718471716 +0000 UTC m=+1372.498517124" Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.863197 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-rhzdd"] Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.866516 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.870222 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.870418 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.892489 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rhzdd"] Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.974121 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.974223 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-scripts\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.974255 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d6h5\" (UniqueName: \"kubernetes.io/projected/e29df702-9c2a-43c8-bfa2-d681e5f0286b-kube-api-access-2d6h5\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:24 crc kubenswrapper[4697]: I0220 16:54:24.974321 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-config-data\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:25 crc kubenswrapper[4697]: I0220 16:54:25.075567 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:25 crc kubenswrapper[4697]: I0220 16:54:25.075664 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-scripts\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:25 crc kubenswrapper[4697]: I0220 16:54:25.075696 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d6h5\" (UniqueName: \"kubernetes.io/projected/e29df702-9c2a-43c8-bfa2-d681e5f0286b-kube-api-access-2d6h5\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:25 crc kubenswrapper[4697]: I0220 16:54:25.075746 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-config-data\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:25 crc kubenswrapper[4697]: I0220 16:54:25.082294 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-config-data\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:25 crc kubenswrapper[4697]: I0220 16:54:25.082816 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:25 crc kubenswrapper[4697]: I0220 16:54:25.083195 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-scripts\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:25 crc kubenswrapper[4697]: I0220 16:54:25.092339 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d6h5\" (UniqueName: \"kubernetes.io/projected/e29df702-9c2a-43c8-bfa2-d681e5f0286b-kube-api-access-2d6h5\") pod \"nova-cell1-cell-mapping-rhzdd\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:25 crc kubenswrapper[4697]: I0220 16:54:25.185369 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:25 crc kubenswrapper[4697]: W0220 16:54:25.637177 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode29df702_9c2a_43c8_bfa2_d681e5f0286b.slice/crio-87e05642c4409010508694623011e71f6f3ca23440535e988adea5c4a118e222 WatchSource:0}: Error finding container 87e05642c4409010508694623011e71f6f3ca23440535e988adea5c4a118e222: Status 404 returned error can't find the container with id 87e05642c4409010508694623011e71f6f3ca23440535e988adea5c4a118e222 Feb 20 16:54:25 crc kubenswrapper[4697]: I0220 16:54:25.639098 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rhzdd"] Feb 20 16:54:25 crc kubenswrapper[4697]: I0220 16:54:25.705972 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rhzdd" event={"ID":"e29df702-9c2a-43c8-bfa2-d681e5f0286b","Type":"ContainerStarted","Data":"87e05642c4409010508694623011e71f6f3ca23440535e988adea5c4a118e222"} Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.109588 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.162937 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cdd54648c-vgxw7"] Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.163169 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" podUID="f858e9b3-afae-4bbd-987e-9f22fe25270c" containerName="dnsmasq-dns" containerID="cri-o://ff3885dc951d6b4c46150938d38477867fb56f3443262b2fdc9b8448f5e33595" gracePeriod=10 Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.736554 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rhzdd" 
event={"ID":"e29df702-9c2a-43c8-bfa2-d681e5f0286b","Type":"ContainerStarted","Data":"c399bc50be840adc92925f1fc55368c82f4b637e286e9cdec9a2cb7fe50d227c"} Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.740671 4697 generic.go:334] "Generic (PLEG): container finished" podID="f858e9b3-afae-4bbd-987e-9f22fe25270c" containerID="ff3885dc951d6b4c46150938d38477867fb56f3443262b2fdc9b8448f5e33595" exitCode=0 Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.740938 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" event={"ID":"f858e9b3-afae-4bbd-987e-9f22fe25270c","Type":"ContainerDied","Data":"ff3885dc951d6b4c46150938d38477867fb56f3443262b2fdc9b8448f5e33595"} Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.740988 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" event={"ID":"f858e9b3-afae-4bbd-987e-9f22fe25270c","Type":"ContainerDied","Data":"502392984d5366bccef8b31783b484dd8701183aee67c7b2819423461d976e7c"} Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.741000 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="502392984d5366bccef8b31783b484dd8701183aee67c7b2819423461d976e7c" Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.763768 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-rhzdd" podStartSLOduration=2.76374406 podStartE2EDuration="2.76374406s" podCreationTimestamp="2026-02-20 16:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:54:26.760915722 +0000 UTC m=+1374.540961120" watchObservedRunningTime="2026-02-20 16:54:26.76374406 +0000 UTC m=+1374.543789468" Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.771570 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.836767 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-swift-storage-0\") pod \"f858e9b3-afae-4bbd-987e-9f22fe25270c\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.836820 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-nb\") pod \"f858e9b3-afae-4bbd-987e-9f22fe25270c\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.836860 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-svc\") pod \"f858e9b3-afae-4bbd-987e-9f22fe25270c\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.836940 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkg7q\" (UniqueName: \"kubernetes.io/projected/f858e9b3-afae-4bbd-987e-9f22fe25270c-kube-api-access-pkg7q\") pod \"f858e9b3-afae-4bbd-987e-9f22fe25270c\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.837017 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-sb\") pod \"f858e9b3-afae-4bbd-987e-9f22fe25270c\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.837070 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-config\") pod \"f858e9b3-afae-4bbd-987e-9f22fe25270c\" (UID: \"f858e9b3-afae-4bbd-987e-9f22fe25270c\") " Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.866469 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f858e9b3-afae-4bbd-987e-9f22fe25270c-kube-api-access-pkg7q" (OuterVolumeSpecName: "kube-api-access-pkg7q") pod "f858e9b3-afae-4bbd-987e-9f22fe25270c" (UID: "f858e9b3-afae-4bbd-987e-9f22fe25270c"). InnerVolumeSpecName "kube-api-access-pkg7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.925281 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f858e9b3-afae-4bbd-987e-9f22fe25270c" (UID: "f858e9b3-afae-4bbd-987e-9f22fe25270c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.925297 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f858e9b3-afae-4bbd-987e-9f22fe25270c" (UID: "f858e9b3-afae-4bbd-987e-9f22fe25270c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.939357 4697 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.939390 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.939399 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkg7q\" (UniqueName: \"kubernetes.io/projected/f858e9b3-afae-4bbd-987e-9f22fe25270c-kube-api-access-pkg7q\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.951896 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f858e9b3-afae-4bbd-987e-9f22fe25270c" (UID: "f858e9b3-afae-4bbd-987e-9f22fe25270c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:54:26 crc kubenswrapper[4697]: I0220 16:54:26.961920 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f858e9b3-afae-4bbd-987e-9f22fe25270c" (UID: "f858e9b3-afae-4bbd-987e-9f22fe25270c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:54:27 crc kubenswrapper[4697]: I0220 16:54:27.007858 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-config" (OuterVolumeSpecName: "config") pod "f858e9b3-afae-4bbd-987e-9f22fe25270c" (UID: "f858e9b3-afae-4bbd-987e-9f22fe25270c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:54:27 crc kubenswrapper[4697]: I0220 16:54:27.040915 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:27 crc kubenswrapper[4697]: I0220 16:54:27.040951 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:27 crc kubenswrapper[4697]: I0220 16:54:27.040961 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f858e9b3-afae-4bbd-987e-9f22fe25270c-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:27 crc kubenswrapper[4697]: I0220 16:54:27.748365 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cdd54648c-vgxw7" Feb 20 16:54:27 crc kubenswrapper[4697]: I0220 16:54:27.791186 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cdd54648c-vgxw7"] Feb 20 16:54:27 crc kubenswrapper[4697]: I0220 16:54:27.798387 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cdd54648c-vgxw7"] Feb 20 16:54:28 crc kubenswrapper[4697]: I0220 16:54:28.891113 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f858e9b3-afae-4bbd-987e-9f22fe25270c" path="/var/lib/kubelet/pods/f858e9b3-afae-4bbd-987e-9f22fe25270c/volumes" Feb 20 16:54:31 crc kubenswrapper[4697]: I0220 16:54:31.120419 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 16:54:31 crc kubenswrapper[4697]: I0220 16:54:31.123202 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 16:54:31 crc kubenswrapper[4697]: I0220 16:54:31.788749 4697 generic.go:334] "Generic (PLEG): container finished" podID="e29df702-9c2a-43c8-bfa2-d681e5f0286b" containerID="c399bc50be840adc92925f1fc55368c82f4b637e286e9cdec9a2cb7fe50d227c" exitCode=0 Feb 20 16:54:31 crc kubenswrapper[4697]: I0220 16:54:31.788822 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rhzdd" event={"ID":"e29df702-9c2a-43c8-bfa2-d681e5f0286b","Type":"ContainerDied","Data":"c399bc50be840adc92925f1fc55368c82f4b637e286e9cdec9a2cb7fe50d227c"} Feb 20 16:54:32 crc kubenswrapper[4697]: I0220 16:54:32.132577 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 16:54:32 crc kubenswrapper[4697]: I0220 16:54:32.132581 4697 prober.go:107] 
"Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.263201 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.338930 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-config-data\") pod \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.339681 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-scripts\") pod \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.339941 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-combined-ca-bundle\") pod \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.340103 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d6h5\" (UniqueName: \"kubernetes.io/projected/e29df702-9c2a-43c8-bfa2-d681e5f0286b-kube-api-access-2d6h5\") pod \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\" (UID: \"e29df702-9c2a-43c8-bfa2-d681e5f0286b\") " Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.347746 4697 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-scripts" (OuterVolumeSpecName: "scripts") pod "e29df702-9c2a-43c8-bfa2-d681e5f0286b" (UID: "e29df702-9c2a-43c8-bfa2-d681e5f0286b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.366402 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29df702-9c2a-43c8-bfa2-d681e5f0286b-kube-api-access-2d6h5" (OuterVolumeSpecName: "kube-api-access-2d6h5") pod "e29df702-9c2a-43c8-bfa2-d681e5f0286b" (UID: "e29df702-9c2a-43c8-bfa2-d681e5f0286b"). InnerVolumeSpecName "kube-api-access-2d6h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.385084 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e29df702-9c2a-43c8-bfa2-d681e5f0286b" (UID: "e29df702-9c2a-43c8-bfa2-d681e5f0286b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.392601 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-config-data" (OuterVolumeSpecName: "config-data") pod "e29df702-9c2a-43c8-bfa2-d681e5f0286b" (UID: "e29df702-9c2a-43c8-bfa2-d681e5f0286b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.442920 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d6h5\" (UniqueName: \"kubernetes.io/projected/e29df702-9c2a-43c8-bfa2-d681e5f0286b-kube-api-access-2d6h5\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.442956 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.442966 4697 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.442978 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29df702-9c2a-43c8-bfa2-d681e5f0286b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.808043 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rhzdd" event={"ID":"e29df702-9c2a-43c8-bfa2-d681e5f0286b","Type":"ContainerDied","Data":"87e05642c4409010508694623011e71f6f3ca23440535e988adea5c4a118e222"} Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.808089 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87e05642c4409010508694623011e71f6f3ca23440535e988adea5c4a118e222" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.808085 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rhzdd" Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.931043 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.931285 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerName="nova-api-log" containerID="cri-o://e4ad22f2ab20c38be8cbc9642c40100cf2ac496b60803b057247012223be0ed3" gracePeriod=30 Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.931365 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerName="nova-api-api" containerID="cri-o://7cfd74243b0a1a8d7fffdf3745674c37af535baf18055296d3eb6dcb6033f36a" gracePeriod=30 Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.950735 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 16:54:33 crc kubenswrapper[4697]: I0220 16:54:33.951005 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="709f19f5-be99-41dd-9d8c-74dc997ee184" containerName="nova-scheduler-scheduler" containerID="cri-o://66f5ba3111da8f26766fceaff09ab9017e64e506cbddb324566e751993f84279" gracePeriod=30 Feb 20 16:54:34 crc kubenswrapper[4697]: I0220 16:54:34.011034 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:54:34 crc kubenswrapper[4697]: I0220 16:54:34.011297 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerName="nova-metadata-log" containerID="cri-o://a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41" gracePeriod=30 Feb 20 16:54:34 crc kubenswrapper[4697]: I0220 16:54:34.011403 4697 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerName="nova-metadata-metadata" containerID="cri-o://74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a" gracePeriod=30 Feb 20 16:54:34 crc kubenswrapper[4697]: I0220 16:54:34.819524 4697 generic.go:334] "Generic (PLEG): container finished" podID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerID="e4ad22f2ab20c38be8cbc9642c40100cf2ac496b60803b057247012223be0ed3" exitCode=143 Feb 20 16:54:34 crc kubenswrapper[4697]: I0220 16:54:34.819872 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e498ea7e-2c61-4388-939b-b9565ee47a24","Type":"ContainerDied","Data":"e4ad22f2ab20c38be8cbc9642c40100cf2ac496b60803b057247012223be0ed3"} Feb 20 16:54:34 crc kubenswrapper[4697]: I0220 16:54:34.821552 4697 generic.go:334] "Generic (PLEG): container finished" podID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerID="a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41" exitCode=143 Feb 20 16:54:34 crc kubenswrapper[4697]: I0220 16:54:34.821630 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b69887b6-1e06-4500-b650-1fb06bed56c7","Type":"ContainerDied","Data":"a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41"} Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.298808 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.377857 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-combined-ca-bundle\") pod \"b69887b6-1e06-4500-b650-1fb06bed56c7\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.378637 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-config-data\") pod \"b69887b6-1e06-4500-b650-1fb06bed56c7\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.379063 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t24c\" (UniqueName: \"kubernetes.io/projected/b69887b6-1e06-4500-b650-1fb06bed56c7-kube-api-access-7t24c\") pod \"b69887b6-1e06-4500-b650-1fb06bed56c7\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.379169 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-nova-metadata-tls-certs\") pod \"b69887b6-1e06-4500-b650-1fb06bed56c7\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.379254 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69887b6-1e06-4500-b650-1fb06bed56c7-logs\") pod \"b69887b6-1e06-4500-b650-1fb06bed56c7\" (UID: \"b69887b6-1e06-4500-b650-1fb06bed56c7\") " Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.380576 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b69887b6-1e06-4500-b650-1fb06bed56c7-logs" (OuterVolumeSpecName: "logs") pod "b69887b6-1e06-4500-b650-1fb06bed56c7" (UID: "b69887b6-1e06-4500-b650-1fb06bed56c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.385070 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b69887b6-1e06-4500-b650-1fb06bed56c7-kube-api-access-7t24c" (OuterVolumeSpecName: "kube-api-access-7t24c") pod "b69887b6-1e06-4500-b650-1fb06bed56c7" (UID: "b69887b6-1e06-4500-b650-1fb06bed56c7"). InnerVolumeSpecName "kube-api-access-7t24c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.438570 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-config-data" (OuterVolumeSpecName: "config-data") pod "b69887b6-1e06-4500-b650-1fb06bed56c7" (UID: "b69887b6-1e06-4500-b650-1fb06bed56c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.439314 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b69887b6-1e06-4500-b650-1fb06bed56c7" (UID: "b69887b6-1e06-4500-b650-1fb06bed56c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.451631 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b69887b6-1e06-4500-b650-1fb06bed56c7" (UID: "b69887b6-1e06-4500-b650-1fb06bed56c7"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.482896 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.482928 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t24c\" (UniqueName: \"kubernetes.io/projected/b69887b6-1e06-4500-b650-1fb06bed56c7-kube-api-access-7t24c\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.482939 4697 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.482949 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69887b6-1e06-4500-b650-1fb06bed56c7-logs\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.482959 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69887b6-1e06-4500-b650-1fb06bed56c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.833322 4697 generic.go:334] "Generic (PLEG): container finished" podID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerID="74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a" exitCode=0 Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.834179 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b69887b6-1e06-4500-b650-1fb06bed56c7","Type":"ContainerDied","Data":"74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a"} 
Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.834218 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b69887b6-1e06-4500-b650-1fb06bed56c7","Type":"ContainerDied","Data":"7d4da44a36ffa2669e9eff0879eaf415e5308a11d4f18405eb3309e3e5bbef4a"} Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.834236 4697 scope.go:117] "RemoveContainer" containerID="74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.834365 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.855221 4697 generic.go:334] "Generic (PLEG): container finished" podID="709f19f5-be99-41dd-9d8c-74dc997ee184" containerID="66f5ba3111da8f26766fceaff09ab9017e64e506cbddb324566e751993f84279" exitCode=0 Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.855275 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"709f19f5-be99-41dd-9d8c-74dc997ee184","Type":"ContainerDied","Data":"66f5ba3111da8f26766fceaff09ab9017e64e506cbddb324566e751993f84279"} Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.937538 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.947210 4697 scope.go:117] "RemoveContainer" containerID="a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.977510 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.993530 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:54:35 crc kubenswrapper[4697]: E0220 16:54:35.993952 4697 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerName="nova-metadata-log" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.993967 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerName="nova-metadata-log" Feb 20 16:54:35 crc kubenswrapper[4697]: E0220 16:54:35.993995 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f858e9b3-afae-4bbd-987e-9f22fe25270c" containerName="dnsmasq-dns" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.994002 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f858e9b3-afae-4bbd-987e-9f22fe25270c" containerName="dnsmasq-dns" Feb 20 16:54:35 crc kubenswrapper[4697]: E0220 16:54:35.994024 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerName="nova-metadata-metadata" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.994029 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerName="nova-metadata-metadata" Feb 20 16:54:35 crc kubenswrapper[4697]: E0220 16:54:35.994037 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f858e9b3-afae-4bbd-987e-9f22fe25270c" containerName="init" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.994043 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f858e9b3-afae-4bbd-987e-9f22fe25270c" containerName="init" Feb 20 16:54:35 crc kubenswrapper[4697]: E0220 16:54:35.994051 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29df702-9c2a-43c8-bfa2-d681e5f0286b" containerName="nova-manage" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.994057 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29df702-9c2a-43c8-bfa2-d681e5f0286b" containerName="nova-manage" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.994224 4697 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerName="nova-metadata-metadata" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.994242 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f858e9b3-afae-4bbd-987e-9f22fe25270c" containerName="dnsmasq-dns" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.994253 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b69887b6-1e06-4500-b650-1fb06bed56c7" containerName="nova-metadata-log" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.994264 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29df702-9c2a-43c8-bfa2-d681e5f0286b" containerName="nova-manage" Feb 20 16:54:35 crc kubenswrapper[4697]: I0220 16:54:35.995490 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.002896 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.003311 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.027907 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.053409 4697 scope.go:117] "RemoveContainer" containerID="74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a" Feb 20 16:54:36 crc kubenswrapper[4697]: E0220 16:54:36.054588 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a\": container with ID starting with 74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a not found: ID does not exist" containerID="74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a" 
Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.054713 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a"} err="failed to get container status \"74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a\": rpc error: code = NotFound desc = could not find container \"74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a\": container with ID starting with 74ed784c56dacdc42c4d476bbe046a731a430444a68097bd34210026458cef5a not found: ID does not exist" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.054806 4697 scope.go:117] "RemoveContainer" containerID="a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41" Feb 20 16:54:36 crc kubenswrapper[4697]: E0220 16:54:36.058830 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41\": container with ID starting with a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41 not found: ID does not exist" containerID="a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.059016 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41"} err="failed to get container status \"a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41\": rpc error: code = NotFound desc = could not find container \"a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41\": container with ID starting with a47807be788591eca102f26c1b2c93c09fa1ccde389512f3fe0ccb0073357c41 not found: ID does not exist" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.106603 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-965kc\" (UniqueName: \"kubernetes.io/projected/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-kube-api-access-965kc\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.106677 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.106700 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-config-data\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.106770 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-logs\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.106797 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.174485 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.208857 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-config-data\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.208965 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-logs\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.208998 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.209061 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-965kc\" (UniqueName: \"kubernetes.io/projected/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-kube-api-access-965kc\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.209121 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.209834 4697 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-logs\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.214411 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.216123 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-config-data\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.220168 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.228443 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-965kc\" (UniqueName: \"kubernetes.io/projected/5f42a083-98ed-403d-90ca-cc4ef5ba79d1-kube-api-access-965kc\") pod \"nova-metadata-0\" (UID: \"5f42a083-98ed-403d-90ca-cc4ef5ba79d1\") " pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.309934 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqvnp\" (UniqueName: \"kubernetes.io/projected/709f19f5-be99-41dd-9d8c-74dc997ee184-kube-api-access-bqvnp\") pod \"709f19f5-be99-41dd-9d8c-74dc997ee184\" (UID: 
\"709f19f5-be99-41dd-9d8c-74dc997ee184\") " Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.310069 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-combined-ca-bundle\") pod \"709f19f5-be99-41dd-9d8c-74dc997ee184\" (UID: \"709f19f5-be99-41dd-9d8c-74dc997ee184\") " Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.310109 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-config-data\") pod \"709f19f5-be99-41dd-9d8c-74dc997ee184\" (UID: \"709f19f5-be99-41dd-9d8c-74dc997ee184\") " Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.313693 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709f19f5-be99-41dd-9d8c-74dc997ee184-kube-api-access-bqvnp" (OuterVolumeSpecName: "kube-api-access-bqvnp") pod "709f19f5-be99-41dd-9d8c-74dc997ee184" (UID: "709f19f5-be99-41dd-9d8c-74dc997ee184"). InnerVolumeSpecName "kube-api-access-bqvnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.329142 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.338530 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "709f19f5-be99-41dd-9d8c-74dc997ee184" (UID: "709f19f5-be99-41dd-9d8c-74dc997ee184"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.350887 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-config-data" (OuterVolumeSpecName: "config-data") pod "709f19f5-be99-41dd-9d8c-74dc997ee184" (UID: "709f19f5-be99-41dd-9d8c-74dc997ee184"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.413055 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqvnp\" (UniqueName: \"kubernetes.io/projected/709f19f5-be99-41dd-9d8c-74dc997ee184-kube-api-access-bqvnp\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.413083 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.413092 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709f19f5-be99-41dd-9d8c-74dc997ee184-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.755411 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 16:54:36 crc kubenswrapper[4697]: W0220 16:54:36.766240 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f42a083_98ed_403d_90ca_cc4ef5ba79d1.slice/crio-5ed0995a08c4b62e8d8c792dfcffffc5f932f3fcd5ca4f13737700b96824f269 WatchSource:0}: Error finding container 5ed0995a08c4b62e8d8c792dfcffffc5f932f3fcd5ca4f13737700b96824f269: Status 404 returned error can't find the container with id 5ed0995a08c4b62e8d8c792dfcffffc5f932f3fcd5ca4f13737700b96824f269 Feb 20 
16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.869341 4697 generic.go:334] "Generic (PLEG): container finished" podID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerID="7cfd74243b0a1a8d7fffdf3745674c37af535baf18055296d3eb6dcb6033f36a" exitCode=0 Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.869405 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e498ea7e-2c61-4388-939b-b9565ee47a24","Type":"ContainerDied","Data":"7cfd74243b0a1a8d7fffdf3745674c37af535baf18055296d3eb6dcb6033f36a"} Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.873570 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"709f19f5-be99-41dd-9d8c-74dc997ee184","Type":"ContainerDied","Data":"f0c93d0c857472be32593ecf96c0b350803f4dff6a28c14e1bbde99c8847cdef"} Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.873612 4697 scope.go:117] "RemoveContainer" containerID="66f5ba3111da8f26766fceaff09ab9017e64e506cbddb324566e751993f84279" Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.873687 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.890776 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b69887b6-1e06-4500-b650-1fb06bed56c7" path="/var/lib/kubelet/pods/b69887b6-1e06-4500-b650-1fb06bed56c7/volumes"
Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.891506 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f42a083-98ed-403d-90ca-cc4ef5ba79d1","Type":"ContainerStarted","Data":"5ed0995a08c4b62e8d8c792dfcffffc5f932f3fcd5ca4f13737700b96824f269"}
Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.937421 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.966871 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.976376 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 16:54:36 crc kubenswrapper[4697]: E0220 16:54:36.976815 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709f19f5-be99-41dd-9d8c-74dc997ee184" containerName="nova-scheduler-scheduler"
Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.976828 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="709f19f5-be99-41dd-9d8c-74dc997ee184" containerName="nova-scheduler-scheduler"
Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.977031 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="709f19f5-be99-41dd-9d8c-74dc997ee184" containerName="nova-scheduler-scheduler"
Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.977828 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.980060 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 20 16:54:36 crc kubenswrapper[4697]: I0220 16:54:36.985507 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.128341 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bcd6e42-c512-4076-926f-b64c1da63a8a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9bcd6e42-c512-4076-926f-b64c1da63a8a\") " pod="openstack/nova-scheduler-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.128472 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nwp6\" (UniqueName: \"kubernetes.io/projected/9bcd6e42-c512-4076-926f-b64c1da63a8a-kube-api-access-9nwp6\") pod \"nova-scheduler-0\" (UID: \"9bcd6e42-c512-4076-926f-b64c1da63a8a\") " pod="openstack/nova-scheduler-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.128537 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bcd6e42-c512-4076-926f-b64c1da63a8a-config-data\") pod \"nova-scheduler-0\" (UID: \"9bcd6e42-c512-4076-926f-b64c1da63a8a\") " pod="openstack/nova-scheduler-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.138265 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.229619 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-public-tls-certs\") pod \"e498ea7e-2c61-4388-939b-b9565ee47a24\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") "
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.229678 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-combined-ca-bundle\") pod \"e498ea7e-2c61-4388-939b-b9565ee47a24\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") "
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.229713 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qjqk\" (UniqueName: \"kubernetes.io/projected/e498ea7e-2c61-4388-939b-b9565ee47a24-kube-api-access-6qjqk\") pod \"e498ea7e-2c61-4388-939b-b9565ee47a24\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") "
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.229757 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-config-data\") pod \"e498ea7e-2c61-4388-939b-b9565ee47a24\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") "
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.229776 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-internal-tls-certs\") pod \"e498ea7e-2c61-4388-939b-b9565ee47a24\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") "
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.229886 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e498ea7e-2c61-4388-939b-b9565ee47a24-logs\") pod \"e498ea7e-2c61-4388-939b-b9565ee47a24\" (UID: \"e498ea7e-2c61-4388-939b-b9565ee47a24\") "
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.230195 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nwp6\" (UniqueName: \"kubernetes.io/projected/9bcd6e42-c512-4076-926f-b64c1da63a8a-kube-api-access-9nwp6\") pod \"nova-scheduler-0\" (UID: \"9bcd6e42-c512-4076-926f-b64c1da63a8a\") " pod="openstack/nova-scheduler-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.230263 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bcd6e42-c512-4076-926f-b64c1da63a8a-config-data\") pod \"nova-scheduler-0\" (UID: \"9bcd6e42-c512-4076-926f-b64c1da63a8a\") " pod="openstack/nova-scheduler-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.230345 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bcd6e42-c512-4076-926f-b64c1da63a8a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9bcd6e42-c512-4076-926f-b64c1da63a8a\") " pod="openstack/nova-scheduler-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.235571 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bcd6e42-c512-4076-926f-b64c1da63a8a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9bcd6e42-c512-4076-926f-b64c1da63a8a\") " pod="openstack/nova-scheduler-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.240848 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e498ea7e-2c61-4388-939b-b9565ee47a24-kube-api-access-6qjqk" (OuterVolumeSpecName: "kube-api-access-6qjqk") pod "e498ea7e-2c61-4388-939b-b9565ee47a24" (UID: "e498ea7e-2c61-4388-939b-b9565ee47a24"). InnerVolumeSpecName "kube-api-access-6qjqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.241509 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e498ea7e-2c61-4388-939b-b9565ee47a24-logs" (OuterVolumeSpecName: "logs") pod "e498ea7e-2c61-4388-939b-b9565ee47a24" (UID: "e498ea7e-2c61-4388-939b-b9565ee47a24"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.241884 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bcd6e42-c512-4076-926f-b64c1da63a8a-config-data\") pod \"nova-scheduler-0\" (UID: \"9bcd6e42-c512-4076-926f-b64c1da63a8a\") " pod="openstack/nova-scheduler-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.260587 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nwp6\" (UniqueName: \"kubernetes.io/projected/9bcd6e42-c512-4076-926f-b64c1da63a8a-kube-api-access-9nwp6\") pod \"nova-scheduler-0\" (UID: \"9bcd6e42-c512-4076-926f-b64c1da63a8a\") " pod="openstack/nova-scheduler-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.269675 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e498ea7e-2c61-4388-939b-b9565ee47a24" (UID: "e498ea7e-2c61-4388-939b-b9565ee47a24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.274653 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-config-data" (OuterVolumeSpecName: "config-data") pod "e498ea7e-2c61-4388-939b-b9565ee47a24" (UID: "e498ea7e-2c61-4388-939b-b9565ee47a24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.293157 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e498ea7e-2c61-4388-939b-b9565ee47a24" (UID: "e498ea7e-2c61-4388-939b-b9565ee47a24"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.294058 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e498ea7e-2c61-4388-939b-b9565ee47a24" (UID: "e498ea7e-2c61-4388-939b-b9565ee47a24"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.302866 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.332570 4697 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e498ea7e-2c61-4388-939b-b9565ee47a24-logs\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.332604 4697 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.332615 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.332626 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qjqk\" (UniqueName: \"kubernetes.io/projected/e498ea7e-2c61-4388-939b-b9565ee47a24-kube-api-access-6qjqk\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.332635 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.332644 4697 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e498ea7e-2c61-4388-939b-b9565ee47a24-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.746365 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 16:54:37 crc kubenswrapper[4697]: W0220 16:54:37.753674 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bcd6e42_c512_4076_926f_b64c1da63a8a.slice/crio-75a18711cd5e70ab24015b854693ef02cf619fe93893b4482500149099c8a027 WatchSource:0}: Error finding container 75a18711cd5e70ab24015b854693ef02cf619fe93893b4482500149099c8a027: Status 404 returned error can't find the container with id 75a18711cd5e70ab24015b854693ef02cf619fe93893b4482500149099c8a027
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.888474 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f42a083-98ed-403d-90ca-cc4ef5ba79d1","Type":"ContainerStarted","Data":"6030fb8776bb8f1195d902ac68cbbd0b1c1ff4ded9fe42ce4b831822fafb94e8"}
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.888831 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f42a083-98ed-403d-90ca-cc4ef5ba79d1","Type":"ContainerStarted","Data":"4343fe7fd0d5f10397a6bf595e1761dfadd5a5b6d713e28ae31578022432440b"}
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.891888 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9bcd6e42-c512-4076-926f-b64c1da63a8a","Type":"ContainerStarted","Data":"75a18711cd5e70ab24015b854693ef02cf619fe93893b4482500149099c8a027"}
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.894650 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e498ea7e-2c61-4388-939b-b9565ee47a24","Type":"ContainerDied","Data":"503a74521c9c232fdb30a9b8f214551a7d4fd117de6c88bb2bd8ca92b4adb816"}
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.894706 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.894728 4697 scope.go:117] "RemoveContainer" containerID="7cfd74243b0a1a8d7fffdf3745674c37af535baf18055296d3eb6dcb6033f36a"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.910189 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.910170793 podStartE2EDuration="2.910170793s" podCreationTimestamp="2026-02-20 16:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:54:37.907495388 +0000 UTC m=+1385.687540806" watchObservedRunningTime="2026-02-20 16:54:37.910170793 +0000 UTC m=+1385.690216201"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.923567 4697 scope.go:117] "RemoveContainer" containerID="e4ad22f2ab20c38be8cbc9642c40100cf2ac496b60803b057247012223be0ed3"
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.950381 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 20 16:54:37 crc kubenswrapper[4697]: I0220 16:54:37.974595 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.001894 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 20 16:54:38 crc kubenswrapper[4697]: E0220 16:54:38.002276 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerName="nova-api-api"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.002291 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerName="nova-api-api"
Feb 20 16:54:38 crc kubenswrapper[4697]: E0220 16:54:38.002312 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerName="nova-api-log"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.002318 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerName="nova-api-log"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.002568 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerName="nova-api-api"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.002596 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e498ea7e-2c61-4388-939b-b9565ee47a24" containerName="nova-api-log"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.003660 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.018044 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.018125 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.018176 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.036257 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.148287 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-config-data\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.148355 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947447ab-bffc-4330-9983-38789a4e8fcc-logs\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.148394 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.148543 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.148657 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-public-tls-certs\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.148736 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jflft\" (UniqueName: \"kubernetes.io/projected/947447ab-bffc-4330-9983-38789a4e8fcc-kube-api-access-jflft\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.250584 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-config-data\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.250674 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947447ab-bffc-4330-9983-38789a4e8fcc-logs\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.250715 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.250850 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.250985 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-public-tls-certs\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.251027 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jflft\" (UniqueName: \"kubernetes.io/projected/947447ab-bffc-4330-9983-38789a4e8fcc-kube-api-access-jflft\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.252042 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947447ab-bffc-4330-9983-38789a4e8fcc-logs\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.258804 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-config-data\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.265159 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.266185 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.272755 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947447ab-bffc-4330-9983-38789a4e8fcc-public-tls-certs\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.277321 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jflft\" (UniqueName: \"kubernetes.io/projected/947447ab-bffc-4330-9983-38789a4e8fcc-kube-api-access-jflft\") pod \"nova-api-0\" (UID: \"947447ab-bffc-4330-9983-38789a4e8fcc\") " pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.335418 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.770116 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 16:54:38 crc kubenswrapper[4697]: W0220 16:54:38.784188 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod947447ab_bffc_4330_9983_38789a4e8fcc.slice/crio-7ec95409211c9b711549cc9c82cdc169dba884b0518b89201e5587614b40bf9d WatchSource:0}: Error finding container 7ec95409211c9b711549cc9c82cdc169dba884b0518b89201e5587614b40bf9d: Status 404 returned error can't find the container with id 7ec95409211c9b711549cc9c82cdc169dba884b0518b89201e5587614b40bf9d
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.897417 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="709f19f5-be99-41dd-9d8c-74dc997ee184" path="/var/lib/kubelet/pods/709f19f5-be99-41dd-9d8c-74dc997ee184/volumes"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.899493 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e498ea7e-2c61-4388-939b-b9565ee47a24" path="/var/lib/kubelet/pods/e498ea7e-2c61-4388-939b-b9565ee47a24/volumes"
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.910772 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"947447ab-bffc-4330-9983-38789a4e8fcc","Type":"ContainerStarted","Data":"7ec95409211c9b711549cc9c82cdc169dba884b0518b89201e5587614b40bf9d"}
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.912923 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9bcd6e42-c512-4076-926f-b64c1da63a8a","Type":"ContainerStarted","Data":"eb57bb733f79893c998081e4d8fc52f079bf96056f13d38f25b74fec548b6688"}
Feb 20 16:54:38 crc kubenswrapper[4697]: I0220 16:54:38.931454 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.931424092 podStartE2EDuration="2.931424092s" podCreationTimestamp="2026-02-20 16:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:54:38.929814373 +0000 UTC m=+1386.709859781" watchObservedRunningTime="2026-02-20 16:54:38.931424092 +0000 UTC m=+1386.711469500"
Feb 20 16:54:39 crc kubenswrapper[4697]: I0220 16:54:39.924422 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"947447ab-bffc-4330-9983-38789a4e8fcc","Type":"ContainerStarted","Data":"532089be27d2e49fd901c057c2e3a31bbe5c3a6bec743c650f6361fcc2b96a2e"}
Feb 20 16:54:39 crc kubenswrapper[4697]: I0220 16:54:39.924793 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"947447ab-bffc-4330-9983-38789a4e8fcc","Type":"ContainerStarted","Data":"3d6e56718191f14659d75fb95d5cac343d3b1645686e02ffb905962ac895d35a"}
Feb 20 16:54:39 crc kubenswrapper[4697]: I0220 16:54:39.955313 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.955288034 podStartE2EDuration="2.955288034s" podCreationTimestamp="2026-02-20 16:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:54:39.946363608 +0000 UTC m=+1387.726409036" watchObservedRunningTime="2026-02-20 16:54:39.955288034 +0000 UTC m=+1387.735333442"
Feb 20 16:54:41 crc kubenswrapper[4697]: I0220 16:54:41.330277 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 16:54:41 crc kubenswrapper[4697]: I0220 16:54:41.330329 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 16:54:42 crc kubenswrapper[4697]: I0220 16:54:42.303474 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 20 16:54:46 crc kubenswrapper[4697]: I0220 16:54:46.330460 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 20 16:54:46 crc kubenswrapper[4697]: I0220 16:54:46.330957 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 20 16:54:47 crc kubenswrapper[4697]: I0220 16:54:47.303336 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 20 16:54:47 crc kubenswrapper[4697]: I0220 16:54:47.336028 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 20 16:54:47 crc kubenswrapper[4697]: I0220 16:54:47.345598 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5f42a083-98ed-403d-90ca-cc4ef5ba79d1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 16:54:47 crc kubenswrapper[4697]: I0220 16:54:47.345619 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5f42a083-98ed-403d-90ca-cc4ef5ba79d1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 16:54:48 crc kubenswrapper[4697]: I0220 16:54:48.036495 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 20 16:54:48 crc kubenswrapper[4697]: I0220 16:54:48.336415 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 20 16:54:48 crc kubenswrapper[4697]: I0220 16:54:48.336769 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 20 16:54:49 crc kubenswrapper[4697]: I0220 16:54:49.345665 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="947447ab-bffc-4330-9983-38789a4e8fcc" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 16:54:49 crc kubenswrapper[4697]: I0220 16:54:49.355647 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="947447ab-bffc-4330-9983-38789a4e8fcc" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 16:54:50 crc kubenswrapper[4697]: I0220 16:54:50.123974 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 20 16:54:56 crc kubenswrapper[4697]: I0220 16:54:56.337342 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 16:54:56 crc kubenswrapper[4697]: I0220 16:54:56.339222 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 16:54:56 crc kubenswrapper[4697]: I0220 16:54:56.346612 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 20 16:54:57 crc kubenswrapper[4697]: I0220 16:54:57.088732 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 20 16:54:58 crc kubenswrapper[4697]: I0220 16:54:58.347597 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 20 16:54:58 crc kubenswrapper[4697]: I0220 16:54:58.348671 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 20 16:54:58 crc kubenswrapper[4697]: I0220 16:54:58.350185 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 20 16:54:58 crc kubenswrapper[4697]: I0220 16:54:58.356021 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 20 16:54:59 crc kubenswrapper[4697]: I0220 16:54:59.101283 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 20 16:54:59 crc kubenswrapper[4697]: I0220 16:54:59.112526 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 20 16:55:07 crc kubenswrapper[4697]: I0220 16:55:07.044192 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 20 16:55:07 crc kubenswrapper[4697]: I0220 16:55:07.972954 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 20 16:55:10 crc kubenswrapper[4697]: I0220 16:55:10.582899 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5224cc9f-d610-4ea0-94da-11cdb019dcce" containerName="rabbitmq" containerID="cri-o://b97e258cdd1189fb417c1ab6d7c7def29ec543522f72b6a334728b99dc9911f0" gracePeriod=604797
Feb 20 16:55:11 crc kubenswrapper[4697]: I0220 16:55:11.458913 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="40ca67b4-1eb6-40a6-ad33-1982ed83eb63" containerName="rabbitmq" containerID="cri-o://55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf" gracePeriod=604797
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.235131 4697 generic.go:334] "Generic (PLEG): container finished" podID="5224cc9f-d610-4ea0-94da-11cdb019dcce" containerID="b97e258cdd1189fb417c1ab6d7c7def29ec543522f72b6a334728b99dc9911f0" exitCode=0
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.235474 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5224cc9f-d610-4ea0-94da-11cdb019dcce","Type":"ContainerDied","Data":"b97e258cdd1189fb417c1ab6d7c7def29ec543522f72b6a334728b99dc9911f0"}
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.308340 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.425503 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-confd\") pod \"5224cc9f-d610-4ea0-94da-11cdb019dcce\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") "
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.425666 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-erlang-cookie\") pod \"5224cc9f-d610-4ea0-94da-11cdb019dcce\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") "
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.425701 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5224cc9f-d610-4ea0-94da-11cdb019dcce-pod-info\") pod \"5224cc9f-d610-4ea0-94da-11cdb019dcce\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") "
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.425750 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5224cc9f-d610-4ea0-94da-11cdb019dcce-erlang-cookie-secret\") pod \"5224cc9f-d610-4ea0-94da-11cdb019dcce\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") "
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.425788 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jvlw\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-kube-api-access-4jvlw\") pod \"5224cc9f-d610-4ea0-94da-11cdb019dcce\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") "
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.425820 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-plugins\") pod \"5224cc9f-d610-4ea0-94da-11cdb019dcce\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") "
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.425930 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-tls\") pod \"5224cc9f-d610-4ea0-94da-11cdb019dcce\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") "
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.426008 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-config-data\") pod \"5224cc9f-d610-4ea0-94da-11cdb019dcce\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") "
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.426039 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-plugins-conf\") pod \"5224cc9f-d610-4ea0-94da-11cdb019dcce\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") "
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.426064 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-server-conf\") pod \"5224cc9f-d610-4ea0-94da-11cdb019dcce\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") "
Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.426149
4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"5224cc9f-d610-4ea0-94da-11cdb019dcce\" (UID: \"5224cc9f-d610-4ea0-94da-11cdb019dcce\") " Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.430084 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5224cc9f-d610-4ea0-94da-11cdb019dcce" (UID: "5224cc9f-d610-4ea0-94da-11cdb019dcce"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.431426 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5224cc9f-d610-4ea0-94da-11cdb019dcce" (UID: "5224cc9f-d610-4ea0-94da-11cdb019dcce"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.442031 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5224cc9f-d610-4ea0-94da-11cdb019dcce-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5224cc9f-d610-4ea0-94da-11cdb019dcce" (UID: "5224cc9f-d610-4ea0-94da-11cdb019dcce"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.442219 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5224cc9f-d610-4ea0-94da-11cdb019dcce" (UID: "5224cc9f-d610-4ea0-94da-11cdb019dcce"). 
InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.443564 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "5224cc9f-d610-4ea0-94da-11cdb019dcce" (UID: "5224cc9f-d610-4ea0-94da-11cdb019dcce"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.443607 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5224cc9f-d610-4ea0-94da-11cdb019dcce-pod-info" (OuterVolumeSpecName: "pod-info") pod "5224cc9f-d610-4ea0-94da-11cdb019dcce" (UID: "5224cc9f-d610-4ea0-94da-11cdb019dcce"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.459670 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-kube-api-access-4jvlw" (OuterVolumeSpecName: "kube-api-access-4jvlw") pod "5224cc9f-d610-4ea0-94da-11cdb019dcce" (UID: "5224cc9f-d610-4ea0-94da-11cdb019dcce"). InnerVolumeSpecName "kube-api-access-4jvlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.476639 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5224cc9f-d610-4ea0-94da-11cdb019dcce" (UID: "5224cc9f-d610-4ea0-94da-11cdb019dcce"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.488120 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-config-data" (OuterVolumeSpecName: "config-data") pod "5224cc9f-d610-4ea0-94da-11cdb019dcce" (UID: "5224cc9f-d610-4ea0-94da-11cdb019dcce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.526475 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-server-conf" (OuterVolumeSpecName: "server-conf") pod "5224cc9f-d610-4ea0-94da-11cdb019dcce" (UID: "5224cc9f-d610-4ea0-94da-11cdb019dcce"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.528601 4697 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.528633 4697 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.528646 4697 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5224cc9f-d610-4ea0-94da-11cdb019dcce-pod-info\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.528657 4697 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5224cc9f-d610-4ea0-94da-11cdb019dcce-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 
20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.528665 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jvlw\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-kube-api-access-4jvlw\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.528674 4697 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.528682 4697 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.528689 4697 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.528697 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.528704 4697 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5224cc9f-d610-4ea0-94da-11cdb019dcce-server-conf\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.559106 4697 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.600610 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5224cc9f-d610-4ea0-94da-11cdb019dcce" (UID: "5224cc9f-d610-4ea0-94da-11cdb019dcce"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.630884 4697 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:12 crc kubenswrapper[4697]: I0220 16:55:12.630919 4697 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5224cc9f-d610-4ea0-94da-11cdb019dcce-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.048662 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.142730 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-pod-info\") pod \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.142787 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjs6q\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-kube-api-access-bjs6q\") pod \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.142823 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-confd\") pod 
\"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.142840 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-plugins\") pod \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.142856 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-config-data\") pod \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.142921 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.143006 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-erlang-cookie\") pod \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.143029 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-server-conf\") pod \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.143054 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-tls\") pod \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.143084 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-plugins-conf\") pod \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.143121 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-erlang-cookie-secret\") pod \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\" (UID: \"40ca67b4-1eb6-40a6-ad33-1982ed83eb63\") " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.145448 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "40ca67b4-1eb6-40a6-ad33-1982ed83eb63" (UID: "40ca67b4-1eb6-40a6-ad33-1982ed83eb63"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.145723 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "40ca67b4-1eb6-40a6-ad33-1982ed83eb63" (UID: "40ca67b4-1eb6-40a6-ad33-1982ed83eb63"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.146023 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "40ca67b4-1eb6-40a6-ad33-1982ed83eb63" (UID: "40ca67b4-1eb6-40a6-ad33-1982ed83eb63"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.150360 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "40ca67b4-1eb6-40a6-ad33-1982ed83eb63" (UID: "40ca67b4-1eb6-40a6-ad33-1982ed83eb63"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.162480 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-pod-info" (OuterVolumeSpecName: "pod-info") pod "40ca67b4-1eb6-40a6-ad33-1982ed83eb63" (UID: "40ca67b4-1eb6-40a6-ad33-1982ed83eb63"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.164337 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "40ca67b4-1eb6-40a6-ad33-1982ed83eb63" (UID: "40ca67b4-1eb6-40a6-ad33-1982ed83eb63"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.167450 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "40ca67b4-1eb6-40a6-ad33-1982ed83eb63" (UID: "40ca67b4-1eb6-40a6-ad33-1982ed83eb63"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.174627 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-kube-api-access-bjs6q" (OuterVolumeSpecName: "kube-api-access-bjs6q") pod "40ca67b4-1eb6-40a6-ad33-1982ed83eb63" (UID: "40ca67b4-1eb6-40a6-ad33-1982ed83eb63"). InnerVolumeSpecName "kube-api-access-bjs6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.180041 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-config-data" (OuterVolumeSpecName: "config-data") pod "40ca67b4-1eb6-40a6-ad33-1982ed83eb63" (UID: "40ca67b4-1eb6-40a6-ad33-1982ed83eb63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.205917 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-server-conf" (OuterVolumeSpecName: "server-conf") pod "40ca67b4-1eb6-40a6-ad33-1982ed83eb63" (UID: "40ca67b4-1eb6-40a6-ad33-1982ed83eb63"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.245712 4697 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.246564 4697 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-server-conf\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.246624 4697 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.246689 4697 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.246745 4697 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.246800 4697 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-pod-info\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.246856 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjs6q\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-kube-api-access-bjs6q\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 
16:55:13.246906 4697 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.246956 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.247023 4697 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.306703 4697 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.310607 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5224cc9f-d610-4ea0-94da-11cdb019dcce","Type":"ContainerDied","Data":"149f6e09cca44ea0851008e6e2eee087d7dc51b26598c0b4fe8bd8fc9e577edb"} Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.310649 4697 scope.go:117] "RemoveContainer" containerID="b97e258cdd1189fb417c1ab6d7c7def29ec543522f72b6a334728b99dc9911f0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.310791 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.331689 4697 generic.go:334] "Generic (PLEG): container finished" podID="40ca67b4-1eb6-40a6-ad33-1982ed83eb63" containerID="55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf" exitCode=0 Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.331729 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"40ca67b4-1eb6-40a6-ad33-1982ed83eb63","Type":"ContainerDied","Data":"55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf"} Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.331756 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"40ca67b4-1eb6-40a6-ad33-1982ed83eb63","Type":"ContainerDied","Data":"4031a5ea37ad4e319c7d2cb4d52b2422257aba8e507bbff9fafa3d8901f117bf"} Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.331831 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.348810 4697 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.356864 4697 scope.go:117] "RemoveContainer" containerID="3f293a48d38b7b781163e876ccf67bbca965ff9663c9453e12702b566695a6d3" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.360130 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "40ca67b4-1eb6-40a6-ad33-1982ed83eb63" (UID: "40ca67b4-1eb6-40a6-ad33-1982ed83eb63"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.374835 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.399647 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.404274 4697 scope.go:117] "RemoveContainer" containerID="55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.409911 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 16:55:13 crc kubenswrapper[4697]: E0220 16:55:13.410363 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ca67b4-1eb6-40a6-ad33-1982ed83eb63" containerName="rabbitmq" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.410382 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ca67b4-1eb6-40a6-ad33-1982ed83eb63" containerName="rabbitmq" Feb 20 16:55:13 crc kubenswrapper[4697]: E0220 16:55:13.410400 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5224cc9f-d610-4ea0-94da-11cdb019dcce" containerName="rabbitmq" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.410408 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="5224cc9f-d610-4ea0-94da-11cdb019dcce" containerName="rabbitmq" Feb 20 16:55:13 crc kubenswrapper[4697]: E0220 16:55:13.410427 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ca67b4-1eb6-40a6-ad33-1982ed83eb63" containerName="setup-container" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.410529 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ca67b4-1eb6-40a6-ad33-1982ed83eb63" containerName="setup-container" Feb 20 16:55:13 crc kubenswrapper[4697]: E0220 16:55:13.410548 4697 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5224cc9f-d610-4ea0-94da-11cdb019dcce" containerName="setup-container" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.410557 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="5224cc9f-d610-4ea0-94da-11cdb019dcce" containerName="setup-container" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.410799 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="5224cc9f-d610-4ea0-94da-11cdb019dcce" containerName="rabbitmq" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.410827 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ca67b4-1eb6-40a6-ad33-1982ed83eb63" containerName="rabbitmq" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.411946 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.413825 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.425263 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.425635 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vxlxq" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.425766 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.425877 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.425976 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.426276 4697 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-server-conf" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.436796 4697 scope.go:117] "RemoveContainer" containerID="0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.437140 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.450682 4697 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/40ca67b4-1eb6-40a6-ad33-1982ed83eb63-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.478079 4697 scope.go:117] "RemoveContainer" containerID="55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf" Feb 20 16:55:13 crc kubenswrapper[4697]: E0220 16:55:13.479254 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf\": container with ID starting with 55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf not found: ID does not exist" containerID="55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.479307 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf"} err="failed to get container status \"55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf\": rpc error: code = NotFound desc = could not find container \"55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf\": container with ID starting with 55b474b066f3c8bc4fd3b997cf4b7f1048936101005744c4f9e613310c48dbcf not found: ID does not exist" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.479327 4697 scope.go:117] "RemoveContainer" 
containerID="0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6" Feb 20 16:55:13 crc kubenswrapper[4697]: E0220 16:55:13.481419 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6\": container with ID starting with 0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6 not found: ID does not exist" containerID="0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.481461 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6"} err="failed to get container status \"0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6\": rpc error: code = NotFound desc = could not find container \"0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6\": container with ID starting with 0cc80e2fc351be7479162c4f999b086841bc3cf0512d270e2d2a9e8c622f67e6 not found: ID does not exist" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.498489 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9z6s8"] Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.500922 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.506691 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9z6s8"] Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.552133 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.552274 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9972c409-92f3-4ec7-9b59-cccd334b761e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.552478 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9972c409-92f3-4ec7-9b59-cccd334b761e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.552579 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.552643 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.552733 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql49c\" (UniqueName: \"kubernetes.io/projected/9972c409-92f3-4ec7-9b59-cccd334b761e-kube-api-access-ql49c\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.552810 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9972c409-92f3-4ec7-9b59-cccd334b761e-config-data\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.552954 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.553010 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9972c409-92f3-4ec7-9b59-cccd334b761e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.553060 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.553112 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9972c409-92f3-4ec7-9b59-cccd334b761e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668089 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668131 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-utilities\") pod \"community-operators-9z6s8\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668157 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9972c409-92f3-4ec7-9b59-cccd334b761e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668198 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9972c409-92f3-4ec7-9b59-cccd334b761e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" 
(UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668223 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-catalog-content\") pod \"community-operators-9z6s8\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668244 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668266 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668303 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wnh9\" (UniqueName: \"kubernetes.io/projected/91b9e859-201c-4dbb-81a0-f1ad8a181126-kube-api-access-2wnh9\") pod \"community-operators-9z6s8\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668323 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql49c\" (UniqueName: \"kubernetes.io/projected/9972c409-92f3-4ec7-9b59-cccd334b761e-kube-api-access-ql49c\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") 
" pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668349 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9972c409-92f3-4ec7-9b59-cccd334b761e-config-data\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668391 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668414 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9972c409-92f3-4ec7-9b59-cccd334b761e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668451 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.668480 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9972c409-92f3-4ec7-9b59-cccd334b761e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.669399 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.670334 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9972c409-92f3-4ec7-9b59-cccd334b761e-config-data\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.673777 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.673865 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.669413 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9972c409-92f3-4ec7-9b59-cccd334b761e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.674295 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.674785 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9972c409-92f3-4ec7-9b59-cccd334b761e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.676195 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9972c409-92f3-4ec7-9b59-cccd334b761e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.678036 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.679662 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9972c409-92f3-4ec7-9b59-cccd334b761e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.681077 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9972c409-92f3-4ec7-9b59-cccd334b761e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.698418 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.705692 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql49c\" 
(UniqueName: \"kubernetes.io/projected/9972c409-92f3-4ec7-9b59-cccd334b761e-kube-api-access-ql49c\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.710603 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.712725 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.720203 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lcn8w" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.720412 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.720555 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.720680 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.720810 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.720924 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.721051 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.721351 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.747577 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"9972c409-92f3-4ec7-9b59-cccd334b761e\") " pod="openstack/rabbitmq-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.770176 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-utilities\") pod \"community-operators-9z6s8\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.770250 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-catalog-content\") pod \"community-operators-9z6s8\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.770289 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wnh9\" (UniqueName: \"kubernetes.io/projected/91b9e859-201c-4dbb-81a0-f1ad8a181126-kube-api-access-2wnh9\") pod \"community-operators-9z6s8\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.771143 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-utilities\") pod \"community-operators-9z6s8\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.772042 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-catalog-content\") pod \"community-operators-9z6s8\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.785728 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wnh9\" (UniqueName: \"kubernetes.io/projected/91b9e859-201c-4dbb-81a0-f1ad8a181126-kube-api-access-2wnh9\") pod \"community-operators-9z6s8\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.822483 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.872338 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/12a44ac2-6e80-4bac-9079-0b6637de700a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.872685 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/12a44ac2-6e80-4bac-9079-0b6637de700a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.872708 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/12a44ac2-6e80-4bac-9079-0b6637de700a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.872726 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.872745 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.872765 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.872990 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.873086 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhsl2\" (UniqueName: \"kubernetes.io/projected/12a44ac2-6e80-4bac-9079-0b6637de700a-kube-api-access-nhsl2\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.873133 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12a44ac2-6e80-4bac-9079-0b6637de700a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.873176 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/12a44ac2-6e80-4bac-9079-0b6637de700a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.873249 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.988150 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhsl2\" (UniqueName: \"kubernetes.io/projected/12a44ac2-6e80-4bac-9079-0b6637de700a-kube-api-access-nhsl2\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.988207 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12a44ac2-6e80-4bac-9079-0b6637de700a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.988232 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/12a44ac2-6e80-4bac-9079-0b6637de700a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.988258 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.988352 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/12a44ac2-6e80-4bac-9079-0b6637de700a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.988401 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/12a44ac2-6e80-4bac-9079-0b6637de700a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.988416 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/12a44ac2-6e80-4bac-9079-0b6637de700a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 
16:55:13.988451 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.988472 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.988494 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.988535 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.988988 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.996390 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.996575 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/12a44ac2-6e80-4bac-9079-0b6637de700a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:13 crc kubenswrapper[4697]: I0220 16:55:13.997646 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12a44ac2-6e80-4bac-9079-0b6637de700a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.000811 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.005894 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/12a44ac2-6e80-4bac-9079-0b6637de700a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.007376 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/12a44ac2-6e80-4bac-9079-0b6637de700a-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.008044 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.010067 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/12a44ac2-6e80-4bac-9079-0b6637de700a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.023604 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/12a44ac2-6e80-4bac-9079-0b6637de700a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.027010 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhsl2\" (UniqueName: \"kubernetes.io/projected/12a44ac2-6e80-4bac-9079-0b6637de700a-kube-api-access-nhsl2\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.030583 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"12a44ac2-6e80-4bac-9079-0b6637de700a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 
16:55:14.046053 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.098300 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9z6s8"] Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.338010 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.352541 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z6s8" event={"ID":"91b9e859-201c-4dbb-81a0-f1ad8a181126","Type":"ContainerStarted","Data":"64994de0099d4650e66a7c1959e18b211f8c092f7aa0831437a6871bcaa40871"} Feb 20 16:55:14 crc kubenswrapper[4697]: W0220 16:55:14.536020 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9972c409_92f3_4ec7_9b59_cccd334b761e.slice/crio-f695e0280a3e45eeb665b6bbb7c1e3a3ad47bc91a460492789f3affd324dbbac WatchSource:0}: Error finding container f695e0280a3e45eeb665b6bbb7c1e3a3ad47bc91a460492789f3affd324dbbac: Status 404 returned error can't find the container with id f695e0280a3e45eeb665b6bbb7c1e3a3ad47bc91a460492789f3affd324dbbac Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.536187 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.840618 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.888604 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40ca67b4-1eb6-40a6-ad33-1982ed83eb63" path="/var/lib/kubelet/pods/40ca67b4-1eb6-40a6-ad33-1982ed83eb63/volumes" Feb 20 16:55:14 crc kubenswrapper[4697]: I0220 16:55:14.890186 4697 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="5224cc9f-d610-4ea0-94da-11cdb019dcce" path="/var/lib/kubelet/pods/5224cc9f-d610-4ea0-94da-11cdb019dcce/volumes" Feb 20 16:55:15 crc kubenswrapper[4697]: I0220 16:55:15.366663 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9972c409-92f3-4ec7-9b59-cccd334b761e","Type":"ContainerStarted","Data":"f695e0280a3e45eeb665b6bbb7c1e3a3ad47bc91a460492789f3affd324dbbac"} Feb 20 16:55:15 crc kubenswrapper[4697]: I0220 16:55:15.370608 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"12a44ac2-6e80-4bac-9079-0b6637de700a","Type":"ContainerStarted","Data":"4d657c39affeb0d82b45cbce24dc7beb7796ad46b20feaff022e96069d2b4c2c"} Feb 20 16:55:15 crc kubenswrapper[4697]: I0220 16:55:15.372638 4697 generic.go:334] "Generic (PLEG): container finished" podID="91b9e859-201c-4dbb-81a0-f1ad8a181126" containerID="c26af1e1d3f4d28cb362b5f3f774ec834bfb0f89a82d788008651ba8cbd637ac" exitCode=0 Feb 20 16:55:15 crc kubenswrapper[4697]: I0220 16:55:15.372666 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z6s8" event={"ID":"91b9e859-201c-4dbb-81a0-f1ad8a181126","Type":"ContainerDied","Data":"c26af1e1d3f4d28cb362b5f3f774ec834bfb0f89a82d788008651ba8cbd637ac"} Feb 20 16:55:16 crc kubenswrapper[4697]: I0220 16:55:16.384407 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z6s8" event={"ID":"91b9e859-201c-4dbb-81a0-f1ad8a181126","Type":"ContainerStarted","Data":"58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1"} Feb 20 16:55:16 crc kubenswrapper[4697]: I0220 16:55:16.385983 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9972c409-92f3-4ec7-9b59-cccd334b761e","Type":"ContainerStarted","Data":"dbda7ffc5d30d170a0b6ab673c444b9487d1ddac543a474d6181b5b0e3fb807f"} Feb 20 16:55:17 crc 
kubenswrapper[4697]: I0220 16:55:17.398247 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"12a44ac2-6e80-4bac-9079-0b6637de700a","Type":"ContainerStarted","Data":"9e2d8c90491b6a82a9e4c0c3a143bf48772c139f3e1a14c555bb46a7e851fd9f"} Feb 20 16:55:17 crc kubenswrapper[4697]: I0220 16:55:17.400531 4697 generic.go:334] "Generic (PLEG): container finished" podID="91b9e859-201c-4dbb-81a0-f1ad8a181126" containerID="58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1" exitCode=0 Feb 20 16:55:17 crc kubenswrapper[4697]: I0220 16:55:17.400646 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z6s8" event={"ID":"91b9e859-201c-4dbb-81a0-f1ad8a181126","Type":"ContainerDied","Data":"58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1"} Feb 20 16:55:18 crc kubenswrapper[4697]: I0220 16:55:18.414561 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z6s8" event={"ID":"91b9e859-201c-4dbb-81a0-f1ad8a181126","Type":"ContainerStarted","Data":"24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f"} Feb 20 16:55:18 crc kubenswrapper[4697]: I0220 16:55:18.446905 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9z6s8" podStartSLOduration=3.025266993 podStartE2EDuration="5.446883582s" podCreationTimestamp="2026-02-20 16:55:13 +0000 UTC" firstStartedPulling="2026-02-20 16:55:15.375028939 +0000 UTC m=+1423.155074347" lastFinishedPulling="2026-02-20 16:55:17.796645518 +0000 UTC m=+1425.576690936" observedRunningTime="2026-02-20 16:55:18.446495503 +0000 UTC m=+1426.226540931" watchObservedRunningTime="2026-02-20 16:55:18.446883582 +0000 UTC m=+1426.226929000" Feb 20 16:55:20 crc kubenswrapper[4697]: I0220 16:55:20.818018 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66cb5fd775-xzjvs"] Feb 20 
16:55:20 crc kubenswrapper[4697]: I0220 16:55:20.819898 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:20 crc kubenswrapper[4697]: I0220 16:55:20.822671 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 20 16:55:20 crc kubenswrapper[4697]: I0220 16:55:20.840515 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cb5fd775-xzjvs"] Feb 20 16:55:20 crc kubenswrapper[4697]: I0220 16:55:20.919910 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-sb\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:20 crc kubenswrapper[4697]: I0220 16:55:20.919956 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-openstack-edpm-ipam\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:20 crc kubenswrapper[4697]: I0220 16:55:20.919990 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-svc\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:20 crc kubenswrapper[4697]: I0220 16:55:20.920016 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-config\") pod 
\"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:20 crc kubenswrapper[4697]: I0220 16:55:20.920077 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxlm6\" (UniqueName: \"kubernetes.io/projected/49f94458-d502-435e-91dc-948077f06813-kube-api-access-fxlm6\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:20 crc kubenswrapper[4697]: I0220 16:55:20.920121 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-swift-storage-0\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:20 crc kubenswrapper[4697]: I0220 16:55:20.920145 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-nb\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.021840 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-sb\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.021878 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-openstack-edpm-ipam\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.021916 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-svc\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.021954 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-config\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.022013 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxlm6\" (UniqueName: \"kubernetes.io/projected/49f94458-d502-435e-91dc-948077f06813-kube-api-access-fxlm6\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.022045 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-swift-storage-0\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.022068 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-nb\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.022863 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-sb\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.022926 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-svc\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.023101 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-openstack-edpm-ipam\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.023244 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-nb\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.023292 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-swift-storage-0\") pod 
\"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.023294 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-config\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.057727 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxlm6\" (UniqueName: \"kubernetes.io/projected/49f94458-d502-435e-91dc-948077f06813-kube-api-access-fxlm6\") pod \"dnsmasq-dns-66cb5fd775-xzjvs\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.172755 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:21 crc kubenswrapper[4697]: I0220 16:55:21.655766 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cb5fd775-xzjvs"] Feb 20 16:55:21 crc kubenswrapper[4697]: W0220 16:55:21.656575 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49f94458_d502_435e_91dc_948077f06813.slice/crio-165410bd316d877c6ab91e324ef623910c0d15e26f633efc84eb9c2c268d9a5e WatchSource:0}: Error finding container 165410bd316d877c6ab91e324ef623910c0d15e26f633efc84eb9c2c268d9a5e: Status 404 returned error can't find the container with id 165410bd316d877c6ab91e324ef623910c0d15e26f633efc84eb9c2c268d9a5e Feb 20 16:55:22 crc kubenswrapper[4697]: I0220 16:55:22.458351 4697 generic.go:334] "Generic (PLEG): container finished" podID="49f94458-d502-435e-91dc-948077f06813" containerID="6325e90c8a7faceae377512070fb2145f68815fc5621e0c34acf1aa289cf436c" exitCode=0 Feb 20 16:55:22 crc kubenswrapper[4697]: I0220 16:55:22.458454 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" event={"ID":"49f94458-d502-435e-91dc-948077f06813","Type":"ContainerDied","Data":"6325e90c8a7faceae377512070fb2145f68815fc5621e0c34acf1aa289cf436c"} Feb 20 16:55:22 crc kubenswrapper[4697]: I0220 16:55:22.458731 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" event={"ID":"49f94458-d502-435e-91dc-948077f06813","Type":"ContainerStarted","Data":"165410bd316d877c6ab91e324ef623910c0d15e26f633efc84eb9c2c268d9a5e"} Feb 20 16:55:23 crc kubenswrapper[4697]: I0220 16:55:23.471172 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" event={"ID":"49f94458-d502-435e-91dc-948077f06813","Type":"ContainerStarted","Data":"2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f"} Feb 20 16:55:23 crc 
kubenswrapper[4697]: I0220 16:55:23.471716 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:23 crc kubenswrapper[4697]: I0220 16:55:23.498868 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" podStartSLOduration=3.498850431 podStartE2EDuration="3.498850431s" podCreationTimestamp="2026-02-20 16:55:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:55:23.491592955 +0000 UTC m=+1431.271638373" watchObservedRunningTime="2026-02-20 16:55:23.498850431 +0000 UTC m=+1431.278895839" Feb 20 16:55:23 crc kubenswrapper[4697]: I0220 16:55:23.823835 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:23 crc kubenswrapper[4697]: I0220 16:55:23.823889 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:23 crc kubenswrapper[4697]: I0220 16:55:23.885564 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:24 crc kubenswrapper[4697]: I0220 16:55:24.526670 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:24 crc kubenswrapper[4697]: I0220 16:55:24.581700 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9z6s8"] Feb 20 16:55:26 crc kubenswrapper[4697]: I0220 16:55:26.498530 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9z6s8" podUID="91b9e859-201c-4dbb-81a0-f1ad8a181126" containerName="registry-server" 
containerID="cri-o://24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f" gracePeriod=2 Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.007637 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.154394 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wnh9\" (UniqueName: \"kubernetes.io/projected/91b9e859-201c-4dbb-81a0-f1ad8a181126-kube-api-access-2wnh9\") pod \"91b9e859-201c-4dbb-81a0-f1ad8a181126\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.154917 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-utilities\") pod \"91b9e859-201c-4dbb-81a0-f1ad8a181126\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.154967 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-catalog-content\") pod \"91b9e859-201c-4dbb-81a0-f1ad8a181126\" (UID: \"91b9e859-201c-4dbb-81a0-f1ad8a181126\") " Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.155769 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-utilities" (OuterVolumeSpecName: "utilities") pod "91b9e859-201c-4dbb-81a0-f1ad8a181126" (UID: "91b9e859-201c-4dbb-81a0-f1ad8a181126"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.167156 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91b9e859-201c-4dbb-81a0-f1ad8a181126-kube-api-access-2wnh9" (OuterVolumeSpecName: "kube-api-access-2wnh9") pod "91b9e859-201c-4dbb-81a0-f1ad8a181126" (UID: "91b9e859-201c-4dbb-81a0-f1ad8a181126"). InnerVolumeSpecName "kube-api-access-2wnh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.211222 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91b9e859-201c-4dbb-81a0-f1ad8a181126" (UID: "91b9e859-201c-4dbb-81a0-f1ad8a181126"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.257708 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.257734 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b9e859-201c-4dbb-81a0-f1ad8a181126-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.257757 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wnh9\" (UniqueName: \"kubernetes.io/projected/91b9e859-201c-4dbb-81a0-f1ad8a181126-kube-api-access-2wnh9\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.510689 4697 generic.go:334] "Generic (PLEG): container finished" podID="91b9e859-201c-4dbb-81a0-f1ad8a181126" 
containerID="24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f" exitCode=0 Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.510742 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9z6s8" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.510756 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z6s8" event={"ID":"91b9e859-201c-4dbb-81a0-f1ad8a181126","Type":"ContainerDied","Data":"24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f"} Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.510804 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z6s8" event={"ID":"91b9e859-201c-4dbb-81a0-f1ad8a181126","Type":"ContainerDied","Data":"64994de0099d4650e66a7c1959e18b211f8c092f7aa0831437a6871bcaa40871"} Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.510834 4697 scope.go:117] "RemoveContainer" containerID="24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.534863 4697 scope.go:117] "RemoveContainer" containerID="58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.566635 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9z6s8"] Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.582554 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9z6s8"] Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.587833 4697 scope.go:117] "RemoveContainer" containerID="c26af1e1d3f4d28cb362b5f3f774ec834bfb0f89a82d788008651ba8cbd637ac" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.627255 4697 scope.go:117] "RemoveContainer" containerID="24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f" Feb 20 
16:55:27 crc kubenswrapper[4697]: E0220 16:55:27.627682 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f\": container with ID starting with 24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f not found: ID does not exist" containerID="24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.627738 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f"} err="failed to get container status \"24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f\": rpc error: code = NotFound desc = could not find container \"24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f\": container with ID starting with 24e3dc476a829df959935ece726a905864430fc635cca82cee7aa4499bd31f8f not found: ID does not exist" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.627891 4697 scope.go:117] "RemoveContainer" containerID="58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1" Feb 20 16:55:27 crc kubenswrapper[4697]: E0220 16:55:27.628294 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1\": container with ID starting with 58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1 not found: ID does not exist" containerID="58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.628346 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1"} err="failed to get container status 
\"58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1\": rpc error: code = NotFound desc = could not find container \"58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1\": container with ID starting with 58a204e4e86c7293227ef43084c8213ba4b8f7b1b646b4d5d5452a69425672e1 not found: ID does not exist" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.628363 4697 scope.go:117] "RemoveContainer" containerID="c26af1e1d3f4d28cb362b5f3f774ec834bfb0f89a82d788008651ba8cbd637ac" Feb 20 16:55:27 crc kubenswrapper[4697]: E0220 16:55:27.628665 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26af1e1d3f4d28cb362b5f3f774ec834bfb0f89a82d788008651ba8cbd637ac\": container with ID starting with c26af1e1d3f4d28cb362b5f3f774ec834bfb0f89a82d788008651ba8cbd637ac not found: ID does not exist" containerID="c26af1e1d3f4d28cb362b5f3f774ec834bfb0f89a82d788008651ba8cbd637ac" Feb 20 16:55:27 crc kubenswrapper[4697]: I0220 16:55:27.628693 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26af1e1d3f4d28cb362b5f3f774ec834bfb0f89a82d788008651ba8cbd637ac"} err="failed to get container status \"c26af1e1d3f4d28cb362b5f3f774ec834bfb0f89a82d788008651ba8cbd637ac\": rpc error: code = NotFound desc = could not find container \"c26af1e1d3f4d28cb362b5f3f774ec834bfb0f89a82d788008651ba8cbd637ac\": container with ID starting with c26af1e1d3f4d28cb362b5f3f774ec834bfb0f89a82d788008651ba8cbd637ac not found: ID does not exist" Feb 20 16:55:28 crc kubenswrapper[4697]: I0220 16:55:28.894125 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91b9e859-201c-4dbb-81a0-f1ad8a181126" path="/var/lib/kubelet/pods/91b9e859-201c-4dbb-81a0-f1ad8a181126/volumes" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.174832 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 
20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.185020 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.185074 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.246835 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f7757544c-2rfnw"] Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.404361 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5955df7555-qdwfx"] Feb 20 16:55:31 crc kubenswrapper[4697]: E0220 16:55:31.405055 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b9e859-201c-4dbb-81a0-f1ad8a181126" containerName="registry-server" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.405145 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b9e859-201c-4dbb-81a0-f1ad8a181126" containerName="registry-server" Feb 20 16:55:31 crc kubenswrapper[4697]: E0220 16:55:31.405231 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b9e859-201c-4dbb-81a0-f1ad8a181126" containerName="extract-utilities" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.405303 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b9e859-201c-4dbb-81a0-f1ad8a181126" containerName="extract-utilities" Feb 20 16:55:31 crc kubenswrapper[4697]: E0220 16:55:31.405383 4697 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="91b9e859-201c-4dbb-81a0-f1ad8a181126" containerName="extract-content" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.405450 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b9e859-201c-4dbb-81a0-f1ad8a181126" containerName="extract-content" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.405692 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="91b9e859-201c-4dbb-81a0-f1ad8a181126" containerName="registry-server" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.406791 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.441316 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5955df7555-qdwfx"] Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.547813 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-config\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.547888 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-dns-svc\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.548086 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-openstack-edpm-ipam\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " 
pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.548159 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-ovsdbserver-sb\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.548179 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-ovsdbserver-nb\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.548289 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdpl5\" (UniqueName: \"kubernetes.io/projected/af760d37-5cb2-4378-811d-cf343b5c9faf-kube-api-access-pdpl5\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.548382 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-dns-swift-storage-0\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.586640 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" podUID="4638cd5a-b95f-482f-b554-cd95fddfa551" containerName="dnsmasq-dns" 
containerID="cri-o://6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463" gracePeriod=10 Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.650416 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-ovsdbserver-sb\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.650493 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-ovsdbserver-nb\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.650591 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdpl5\" (UniqueName: \"kubernetes.io/projected/af760d37-5cb2-4378-811d-cf343b5c9faf-kube-api-access-pdpl5\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.650667 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-dns-swift-storage-0\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.650755 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-config\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: 
\"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.650832 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-dns-svc\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.650933 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-openstack-edpm-ipam\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.651600 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-ovsdbserver-nb\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.651628 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-ovsdbserver-sb\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.651747 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-openstack-edpm-ipam\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " 
pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.651833 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-config\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.652347 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-dns-swift-storage-0\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.652428 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af760d37-5cb2-4378-811d-cf343b5c9faf-dns-svc\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.681205 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdpl5\" (UniqueName: \"kubernetes.io/projected/af760d37-5cb2-4378-811d-cf343b5c9faf-kube-api-access-pdpl5\") pod \"dnsmasq-dns-5955df7555-qdwfx\" (UID: \"af760d37-5cb2-4378-811d-cf343b5c9faf\") " pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:31 crc kubenswrapper[4697]: I0220 16:55:31.726355 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.148858 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.275775 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5955df7555-qdwfx"] Feb 20 16:55:32 crc kubenswrapper[4697]: W0220 16:55:32.277523 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf760d37_5cb2_4378_811d_cf343b5c9faf.slice/crio-5516c2554909a2369fbc0b0bd1ecf59c3b3a9154600d4045ed0b4d175c0ccd41 WatchSource:0}: Error finding container 5516c2554909a2369fbc0b0bd1ecf59c3b3a9154600d4045ed0b4d175c0ccd41: Status 404 returned error can't find the container with id 5516c2554909a2369fbc0b0bd1ecf59c3b3a9154600d4045ed0b4d175c0ccd41 Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.280495 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-nb\") pod \"4638cd5a-b95f-482f-b554-cd95fddfa551\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.280551 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvq74\" (UniqueName: \"kubernetes.io/projected/4638cd5a-b95f-482f-b554-cd95fddfa551-kube-api-access-kvq74\") pod \"4638cd5a-b95f-482f-b554-cd95fddfa551\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.280647 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-svc\") pod \"4638cd5a-b95f-482f-b554-cd95fddfa551\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.280680 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-swift-storage-0\") pod \"4638cd5a-b95f-482f-b554-cd95fddfa551\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.280738 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-sb\") pod \"4638cd5a-b95f-482f-b554-cd95fddfa551\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.280775 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-config\") pod \"4638cd5a-b95f-482f-b554-cd95fddfa551\" (UID: \"4638cd5a-b95f-482f-b554-cd95fddfa551\") " Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.302913 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4638cd5a-b95f-482f-b554-cd95fddfa551-kube-api-access-kvq74" (OuterVolumeSpecName: "kube-api-access-kvq74") pod "4638cd5a-b95f-482f-b554-cd95fddfa551" (UID: "4638cd5a-b95f-482f-b554-cd95fddfa551"). InnerVolumeSpecName "kube-api-access-kvq74". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.338774 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-config" (OuterVolumeSpecName: "config") pod "4638cd5a-b95f-482f-b554-cd95fddfa551" (UID: "4638cd5a-b95f-482f-b554-cd95fddfa551"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.341608 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4638cd5a-b95f-482f-b554-cd95fddfa551" (UID: "4638cd5a-b95f-482f-b554-cd95fddfa551"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.348706 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4638cd5a-b95f-482f-b554-cd95fddfa551" (UID: "4638cd5a-b95f-482f-b554-cd95fddfa551"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.351373 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4638cd5a-b95f-482f-b554-cd95fddfa551" (UID: "4638cd5a-b95f-482f-b554-cd95fddfa551"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.353021 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4638cd5a-b95f-482f-b554-cd95fddfa551" (UID: "4638cd5a-b95f-482f-b554-cd95fddfa551"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.382576 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.382839 4697 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.382850 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.382859 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.382867 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4638cd5a-b95f-482f-b554-cd95fddfa551-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.382875 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvq74\" (UniqueName: \"kubernetes.io/projected/4638cd5a-b95f-482f-b554-cd95fddfa551-kube-api-access-kvq74\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.596074 4697 generic.go:334] "Generic (PLEG): container finished" podID="af760d37-5cb2-4378-811d-cf343b5c9faf" containerID="66157f0360ba7d867cf0afc5390261db572094a38ffd6fcec522cd83ebe2533f" exitCode=0 Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.596150 4697 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5955df7555-qdwfx" event={"ID":"af760d37-5cb2-4378-811d-cf343b5c9faf","Type":"ContainerDied","Data":"66157f0360ba7d867cf0afc5390261db572094a38ffd6fcec522cd83ebe2533f"} Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.596176 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5955df7555-qdwfx" event={"ID":"af760d37-5cb2-4378-811d-cf343b5c9faf","Type":"ContainerStarted","Data":"5516c2554909a2369fbc0b0bd1ecf59c3b3a9154600d4045ed0b4d175c0ccd41"} Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.598994 4697 generic.go:334] "Generic (PLEG): container finished" podID="4638cd5a-b95f-482f-b554-cd95fddfa551" containerID="6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463" exitCode=0 Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.599030 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" event={"ID":"4638cd5a-b95f-482f-b554-cd95fddfa551","Type":"ContainerDied","Data":"6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463"} Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.599093 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" event={"ID":"4638cd5a-b95f-482f-b554-cd95fddfa551","Type":"ContainerDied","Data":"77a6a1b07f1ec54a68ba9528b961bcedbb245f68efc20c6cbc63255239ec56a3"} Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.599121 4697 scope.go:117] "RemoveContainer" containerID="6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.599254 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f7757544c-2rfnw" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.639979 4697 scope.go:117] "RemoveContainer" containerID="903defd0b77d02cfd99bcfd7f32d7e5c35d0070e52a758a1d595c926fd828c50" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.660145 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f7757544c-2rfnw"] Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.670641 4697 scope.go:117] "RemoveContainer" containerID="6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463" Feb 20 16:55:32 crc kubenswrapper[4697]: E0220 16:55:32.673379 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463\": container with ID starting with 6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463 not found: ID does not exist" containerID="6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.673422 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463"} err="failed to get container status \"6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463\": rpc error: code = NotFound desc = could not find container \"6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463\": container with ID starting with 6670b8e08fa53059cf6755f21c4979826ce573e6fedf84db69ae4402a0365463 not found: ID does not exist" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.673491 4697 scope.go:117] "RemoveContainer" containerID="903defd0b77d02cfd99bcfd7f32d7e5c35d0070e52a758a1d595c926fd828c50" Feb 20 16:55:32 crc kubenswrapper[4697]: E0220 16:55:32.674236 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"903defd0b77d02cfd99bcfd7f32d7e5c35d0070e52a758a1d595c926fd828c50\": container with ID starting with 903defd0b77d02cfd99bcfd7f32d7e5c35d0070e52a758a1d595c926fd828c50 not found: ID does not exist" containerID="903defd0b77d02cfd99bcfd7f32d7e5c35d0070e52a758a1d595c926fd828c50" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.674270 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903defd0b77d02cfd99bcfd7f32d7e5c35d0070e52a758a1d595c926fd828c50"} err="failed to get container status \"903defd0b77d02cfd99bcfd7f32d7e5c35d0070e52a758a1d595c926fd828c50\": rpc error: code = NotFound desc = could not find container \"903defd0b77d02cfd99bcfd7f32d7e5c35d0070e52a758a1d595c926fd828c50\": container with ID starting with 903defd0b77d02cfd99bcfd7f32d7e5c35d0070e52a758a1d595c926fd828c50 not found: ID does not exist" Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.685994 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f7757544c-2rfnw"] Feb 20 16:55:32 crc kubenswrapper[4697]: I0220 16:55:32.888914 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4638cd5a-b95f-482f-b554-cd95fddfa551" path="/var/lib/kubelet/pods/4638cd5a-b95f-482f-b554-cd95fddfa551/volumes" Feb 20 16:55:33 crc kubenswrapper[4697]: I0220 16:55:33.617281 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5955df7555-qdwfx" event={"ID":"af760d37-5cb2-4378-811d-cf343b5c9faf","Type":"ContainerStarted","Data":"b8f377d5505f7848e9c30bb74fbf46f68db067325ef4bdbb83781fe3598a9313"} Feb 20 16:55:33 crc kubenswrapper[4697]: I0220 16:55:33.617687 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:33 crc kubenswrapper[4697]: I0220 16:55:33.639157 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5955df7555-qdwfx" 
podStartSLOduration=2.639142648 podStartE2EDuration="2.639142648s" podCreationTimestamp="2026-02-20 16:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:55:33.635533021 +0000 UTC m=+1441.415578419" watchObservedRunningTime="2026-02-20 16:55:33.639142648 +0000 UTC m=+1441.419188056" Feb 20 16:55:41 crc kubenswrapper[4697]: I0220 16:55:41.729607 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5955df7555-qdwfx" Feb 20 16:55:41 crc kubenswrapper[4697]: I0220 16:55:41.853133 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cb5fd775-xzjvs"] Feb 20 16:55:41 crc kubenswrapper[4697]: I0220 16:55:41.853385 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" podUID="49f94458-d502-435e-91dc-948077f06813" containerName="dnsmasq-dns" containerID="cri-o://2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f" gracePeriod=10 Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.441247 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.587690 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-svc\") pod \"49f94458-d502-435e-91dc-948077f06813\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.587862 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-nb\") pod \"49f94458-d502-435e-91dc-948077f06813\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.587936 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-sb\") pod \"49f94458-d502-435e-91dc-948077f06813\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.587953 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-config\") pod \"49f94458-d502-435e-91dc-948077f06813\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.588001 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxlm6\" (UniqueName: \"kubernetes.io/projected/49f94458-d502-435e-91dc-948077f06813-kube-api-access-fxlm6\") pod \"49f94458-d502-435e-91dc-948077f06813\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.588059 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-openstack-edpm-ipam\") pod \"49f94458-d502-435e-91dc-948077f06813\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.588079 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-swift-storage-0\") pod \"49f94458-d502-435e-91dc-948077f06813\" (UID: \"49f94458-d502-435e-91dc-948077f06813\") " Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.602025 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f94458-d502-435e-91dc-948077f06813-kube-api-access-fxlm6" (OuterVolumeSpecName: "kube-api-access-fxlm6") pod "49f94458-d502-435e-91dc-948077f06813" (UID: "49f94458-d502-435e-91dc-948077f06813"). InnerVolumeSpecName "kube-api-access-fxlm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.636643 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "49f94458-d502-435e-91dc-948077f06813" (UID: "49f94458-d502-435e-91dc-948077f06813"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.637745 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "49f94458-d502-435e-91dc-948077f06813" (UID: "49f94458-d502-435e-91dc-948077f06813"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.643136 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "49f94458-d502-435e-91dc-948077f06813" (UID: "49f94458-d502-435e-91dc-948077f06813"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.644695 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "49f94458-d502-435e-91dc-948077f06813" (UID: "49f94458-d502-435e-91dc-948077f06813"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.646024 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "49f94458-d502-435e-91dc-948077f06813" (UID: "49f94458-d502-435e-91dc-948077f06813"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.649107 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-config" (OuterVolumeSpecName: "config") pod "49f94458-d502-435e-91dc-948077f06813" (UID: "49f94458-d502-435e-91dc-948077f06813"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.690722 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.690759 4697 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.690773 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-config\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.690785 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxlm6\" (UniqueName: \"kubernetes.io/projected/49f94458-d502-435e-91dc-948077f06813-kube-api-access-fxlm6\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.690799 4697 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.690811 4697 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.690821 4697 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49f94458-d502-435e-91dc-948077f06813-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.736819 
4697 generic.go:334] "Generic (PLEG): container finished" podID="49f94458-d502-435e-91dc-948077f06813" containerID="2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f" exitCode=0 Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.736897 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.736892 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" event={"ID":"49f94458-d502-435e-91dc-948077f06813","Type":"ContainerDied","Data":"2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f"} Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.736969 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cb5fd775-xzjvs" event={"ID":"49f94458-d502-435e-91dc-948077f06813","Type":"ContainerDied","Data":"165410bd316d877c6ab91e324ef623910c0d15e26f633efc84eb9c2c268d9a5e"} Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.736995 4697 scope.go:117] "RemoveContainer" containerID="2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.771130 4697 scope.go:117] "RemoveContainer" containerID="6325e90c8a7faceae377512070fb2145f68815fc5621e0c34acf1aa289cf436c" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.779020 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cb5fd775-xzjvs"] Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.790745 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66cb5fd775-xzjvs"] Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.798647 4697 scope.go:117] "RemoveContainer" containerID="2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f" Feb 20 16:55:42 crc kubenswrapper[4697]: E0220 16:55:42.799092 4697 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f\": container with ID starting with 2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f not found: ID does not exist" containerID="2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.799121 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f"} err="failed to get container status \"2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f\": rpc error: code = NotFound desc = could not find container \"2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f\": container with ID starting with 2d1da43f120fe73a0c9fd239a1b3c56d1caf806f350fa1924fd1f48afc41027f not found: ID does not exist" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.799141 4697 scope.go:117] "RemoveContainer" containerID="6325e90c8a7faceae377512070fb2145f68815fc5621e0c34acf1aa289cf436c" Feb 20 16:55:42 crc kubenswrapper[4697]: E0220 16:55:42.799350 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6325e90c8a7faceae377512070fb2145f68815fc5621e0c34acf1aa289cf436c\": container with ID starting with 6325e90c8a7faceae377512070fb2145f68815fc5621e0c34acf1aa289cf436c not found: ID does not exist" containerID="6325e90c8a7faceae377512070fb2145f68815fc5621e0c34acf1aa289cf436c" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.799371 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6325e90c8a7faceae377512070fb2145f68815fc5621e0c34acf1aa289cf436c"} err="failed to get container status \"6325e90c8a7faceae377512070fb2145f68815fc5621e0c34acf1aa289cf436c\": rpc error: code = NotFound desc = could not find container 
\"6325e90c8a7faceae377512070fb2145f68815fc5621e0c34acf1aa289cf436c\": container with ID starting with 6325e90c8a7faceae377512070fb2145f68815fc5621e0c34acf1aa289cf436c not found: ID does not exist" Feb 20 16:55:42 crc kubenswrapper[4697]: I0220 16:55:42.889153 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f94458-d502-435e-91dc-948077f06813" path="/var/lib/kubelet/pods/49f94458-d502-435e-91dc-948077f06813/volumes" Feb 20 16:55:48 crc kubenswrapper[4697]: I0220 16:55:48.794025 4697 generic.go:334] "Generic (PLEG): container finished" podID="9972c409-92f3-4ec7-9b59-cccd334b761e" containerID="dbda7ffc5d30d170a0b6ab673c444b9487d1ddac543a474d6181b5b0e3fb807f" exitCode=0 Feb 20 16:55:48 crc kubenswrapper[4697]: I0220 16:55:48.794157 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9972c409-92f3-4ec7-9b59-cccd334b761e","Type":"ContainerDied","Data":"dbda7ffc5d30d170a0b6ab673c444b9487d1ddac543a474d6181b5b0e3fb807f"} Feb 20 16:55:48 crc kubenswrapper[4697]: I0220 16:55:48.797169 4697 generic.go:334] "Generic (PLEG): container finished" podID="12a44ac2-6e80-4bac-9079-0b6637de700a" containerID="9e2d8c90491b6a82a9e4c0c3a143bf48772c139f3e1a14c555bb46a7e851fd9f" exitCode=0 Feb 20 16:55:48 crc kubenswrapper[4697]: I0220 16:55:48.797214 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"12a44ac2-6e80-4bac-9079-0b6637de700a","Type":"ContainerDied","Data":"9e2d8c90491b6a82a9e4c0c3a143bf48772c139f3e1a14c555bb46a7e851fd9f"} Feb 20 16:55:49 crc kubenswrapper[4697]: I0220 16:55:49.809297 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9972c409-92f3-4ec7-9b59-cccd334b761e","Type":"ContainerStarted","Data":"c9f2956fe4679ed8b0ab4cdda7cbc333102cb95f6dee46571907c486531b7ee5"} Feb 20 16:55:49 crc kubenswrapper[4697]: I0220 16:55:49.809835 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/rabbitmq-server-0" Feb 20 16:55:49 crc kubenswrapper[4697]: I0220 16:55:49.811682 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"12a44ac2-6e80-4bac-9079-0b6637de700a","Type":"ContainerStarted","Data":"069fe45ef70a9bdc114e3da5cc0cf5f72c34f7f75a9100ceb571ebb2557feec9"} Feb 20 16:55:49 crc kubenswrapper[4697]: I0220 16:55:49.811864 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:55:49 crc kubenswrapper[4697]: I0220 16:55:49.832379 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.83236413 podStartE2EDuration="36.83236413s" podCreationTimestamp="2026-02-20 16:55:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:55:49.82865902 +0000 UTC m=+1457.608704458" watchObservedRunningTime="2026-02-20 16:55:49.83236413 +0000 UTC m=+1457.612409538" Feb 20 16:55:49 crc kubenswrapper[4697]: I0220 16:55:49.857133 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.85711303 podStartE2EDuration="36.85711303s" podCreationTimestamp="2026-02-20 16:55:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 16:55:49.851613867 +0000 UTC m=+1457.631659315" watchObservedRunningTime="2026-02-20 16:55:49.85711303 +0000 UTC m=+1457.637158428" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.611144 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf"] Feb 20 16:55:59 crc kubenswrapper[4697]: E0220 16:55:59.612476 4697 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="49f94458-d502-435e-91dc-948077f06813" containerName="init" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.612493 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f94458-d502-435e-91dc-948077f06813" containerName="init" Feb 20 16:55:59 crc kubenswrapper[4697]: E0220 16:55:59.612509 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4638cd5a-b95f-482f-b554-cd95fddfa551" containerName="dnsmasq-dns" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.612517 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638cd5a-b95f-482f-b554-cd95fddfa551" containerName="dnsmasq-dns" Feb 20 16:55:59 crc kubenswrapper[4697]: E0220 16:55:59.612553 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f94458-d502-435e-91dc-948077f06813" containerName="dnsmasq-dns" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.612561 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f94458-d502-435e-91dc-948077f06813" containerName="dnsmasq-dns" Feb 20 16:55:59 crc kubenswrapper[4697]: E0220 16:55:59.612583 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4638cd5a-b95f-482f-b554-cd95fddfa551" containerName="init" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.612590 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638cd5a-b95f-482f-b554-cd95fddfa551" containerName="init" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.612812 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f94458-d502-435e-91dc-948077f06813" containerName="dnsmasq-dns" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.612835 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="4638cd5a-b95f-482f-b554-cd95fddfa551" containerName="dnsmasq-dns" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.613736 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.622153 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.622191 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.622857 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.628152 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.634885 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf"] Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.647685 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6chxj\" (UniqueName: \"kubernetes.io/projected/143cf213-8274-47bd-b6f4-80f2d465275c-kube-api-access-6chxj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.647756 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 
16:55:59.647846 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.647940 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.749112 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6chxj\" (UniqueName: \"kubernetes.io/projected/143cf213-8274-47bd-b6f4-80f2d465275c-kube-api-access-6chxj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.749173 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.749269 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.749358 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.756899 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.762972 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.765820 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.768602 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6chxj\" (UniqueName: \"kubernetes.io/projected/143cf213-8274-47bd-b6f4-80f2d465275c-kube-api-access-6chxj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:55:59 crc kubenswrapper[4697]: I0220 16:55:59.986555 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:56:00 crc kubenswrapper[4697]: I0220 16:56:00.820340 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf"] Feb 20 16:56:00 crc kubenswrapper[4697]: W0220 16:56:00.820640 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod143cf213_8274_47bd_b6f4_80f2d465275c.slice/crio-5b5322bf04ad5afc4101393a2973ae830d2323f41f2991bc06fab00b3ae578a4 WatchSource:0}: Error finding container 5b5322bf04ad5afc4101393a2973ae830d2323f41f2991bc06fab00b3ae578a4: Status 404 returned error can't find the container with id 5b5322bf04ad5afc4101393a2973ae830d2323f41f2991bc06fab00b3ae578a4 Feb 20 16:56:00 crc kubenswrapper[4697]: I0220 16:56:00.909723 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" event={"ID":"143cf213-8274-47bd-b6f4-80f2d465275c","Type":"ContainerStarted","Data":"5b5322bf04ad5afc4101393a2973ae830d2323f41f2991bc06fab00b3ae578a4"} Feb 20 16:56:01 crc kubenswrapper[4697]: I0220 16:56:01.184702 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:56:01 crc kubenswrapper[4697]: I0220 16:56:01.184781 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:56:04 crc kubenswrapper[4697]: I0220 16:56:04.049609 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 20 16:56:04 crc kubenswrapper[4697]: I0220 16:56:04.342646 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 16:56:10 crc kubenswrapper[4697]: I0220 16:56:10.997725 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" event={"ID":"143cf213-8274-47bd-b6f4-80f2d465275c","Type":"ContainerStarted","Data":"52b60470c61a646ab0609904fbe6e032ca2d090f4dedd160fef2bd7e96316b32"} Feb 20 16:56:11 crc kubenswrapper[4697]: I0220 16:56:11.024904 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" podStartSLOduration=3.057568244 podStartE2EDuration="12.024878055s" podCreationTimestamp="2026-02-20 16:55:59 +0000 UTC" firstStartedPulling="2026-02-20 16:56:00.823951337 +0000 UTC m=+1468.603996745" lastFinishedPulling="2026-02-20 16:56:09.791261148 +0000 UTC m=+1477.571306556" observedRunningTime="2026-02-20 16:56:11.015685992 +0000 UTC m=+1478.795731430" watchObservedRunningTime="2026-02-20 16:56:11.024878055 +0000 UTC m=+1478.804923503" Feb 20 16:56:21 crc kubenswrapper[4697]: I0220 16:56:21.100704 4697 generic.go:334] "Generic (PLEG): container finished" 
podID="143cf213-8274-47bd-b6f4-80f2d465275c" containerID="52b60470c61a646ab0609904fbe6e032ca2d090f4dedd160fef2bd7e96316b32" exitCode=0 Feb 20 16:56:21 crc kubenswrapper[4697]: I0220 16:56:21.100775 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" event={"ID":"143cf213-8274-47bd-b6f4-80f2d465275c","Type":"ContainerDied","Data":"52b60470c61a646ab0609904fbe6e032ca2d090f4dedd160fef2bd7e96316b32"} Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.661358 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.823127 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-ssh-key-openstack-edpm-ipam\") pod \"143cf213-8274-47bd-b6f4-80f2d465275c\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.823204 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-repo-setup-combined-ca-bundle\") pod \"143cf213-8274-47bd-b6f4-80f2d465275c\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.823457 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-inventory\") pod \"143cf213-8274-47bd-b6f4-80f2d465275c\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.823518 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6chxj\" (UniqueName: 
\"kubernetes.io/projected/143cf213-8274-47bd-b6f4-80f2d465275c-kube-api-access-6chxj\") pod \"143cf213-8274-47bd-b6f4-80f2d465275c\" (UID: \"143cf213-8274-47bd-b6f4-80f2d465275c\") " Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.831853 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "143cf213-8274-47bd-b6f4-80f2d465275c" (UID: "143cf213-8274-47bd-b6f4-80f2d465275c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.834699 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143cf213-8274-47bd-b6f4-80f2d465275c-kube-api-access-6chxj" (OuterVolumeSpecName: "kube-api-access-6chxj") pod "143cf213-8274-47bd-b6f4-80f2d465275c" (UID: "143cf213-8274-47bd-b6f4-80f2d465275c"). InnerVolumeSpecName "kube-api-access-6chxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.861423 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-inventory" (OuterVolumeSpecName: "inventory") pod "143cf213-8274-47bd-b6f4-80f2d465275c" (UID: "143cf213-8274-47bd-b6f4-80f2d465275c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.869803 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "143cf213-8274-47bd-b6f4-80f2d465275c" (UID: "143cf213-8274-47bd-b6f4-80f2d465275c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.926173 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.926218 4697 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.926232 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/143cf213-8274-47bd-b6f4-80f2d465275c-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 16:56:22 crc kubenswrapper[4697]: I0220 16:56:22.926245 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6chxj\" (UniqueName: \"kubernetes.io/projected/143cf213-8274-47bd-b6f4-80f2d465275c-kube-api-access-6chxj\") on node \"crc\" DevicePath \"\"" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.125602 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" event={"ID":"143cf213-8274-47bd-b6f4-80f2d465275c","Type":"ContainerDied","Data":"5b5322bf04ad5afc4101393a2973ae830d2323f41f2991bc06fab00b3ae578a4"} Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.125641 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b5322bf04ad5afc4101393a2973ae830d2323f41f2991bc06fab00b3ae578a4" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.125662 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.231953 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2"] Feb 20 16:56:23 crc kubenswrapper[4697]: E0220 16:56:23.232396 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143cf213-8274-47bd-b6f4-80f2d465275c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.232415 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="143cf213-8274-47bd-b6f4-80f2d465275c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.232642 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="143cf213-8274-47bd-b6f4-80f2d465275c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.233266 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.240883 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.241640 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.241886 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.242048 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.253012 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2"] Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.334766 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbn9h\" (UniqueName: \"kubernetes.io/projected/603c4295-1159-4a8b-856f-c40cb2a0838c-kube-api-access-fbn9h\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gp4g2\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.334924 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gp4g2\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.334998 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gp4g2\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.436454 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbn9h\" (UniqueName: \"kubernetes.io/projected/603c4295-1159-4a8b-856f-c40cb2a0838c-kube-api-access-fbn9h\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gp4g2\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.436605 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gp4g2\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.436671 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gp4g2\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.440453 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-gp4g2\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.440833 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gp4g2\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.456765 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbn9h\" (UniqueName: \"kubernetes.io/projected/603c4295-1159-4a8b-856f-c40cb2a0838c-kube-api-access-fbn9h\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gp4g2\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:23 crc kubenswrapper[4697]: I0220 16:56:23.557619 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:24 crc kubenswrapper[4697]: I0220 16:56:24.139703 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2"] Feb 20 16:56:25 crc kubenswrapper[4697]: I0220 16:56:25.151860 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" event={"ID":"603c4295-1159-4a8b-856f-c40cb2a0838c","Type":"ContainerStarted","Data":"a67bd53dbaf6d6017ed771030a1dddfe9558ead27568b4c08eb257e5504dbe9f"} Feb 20 16:56:25 crc kubenswrapper[4697]: I0220 16:56:25.152171 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" event={"ID":"603c4295-1159-4a8b-856f-c40cb2a0838c","Type":"ContainerStarted","Data":"0e03db6c1da40b6118d04a187b7a87c33317ec12fc2c325ee6dccc4ea883bd52"} Feb 20 16:56:25 crc kubenswrapper[4697]: I0220 16:56:25.177085 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" podStartSLOduration=1.771023702 podStartE2EDuration="2.17706222s" podCreationTimestamp="2026-02-20 16:56:23 +0000 UTC" firstStartedPulling="2026-02-20 16:56:24.15406784 +0000 UTC m=+1491.934113248" lastFinishedPulling="2026-02-20 16:56:24.560106308 +0000 UTC m=+1492.340151766" observedRunningTime="2026-02-20 16:56:25.169724354 +0000 UTC m=+1492.949769782" watchObservedRunningTime="2026-02-20 16:56:25.17706222 +0000 UTC m=+1492.957107638" Feb 20 16:56:28 crc kubenswrapper[4697]: I0220 16:56:28.192215 4697 generic.go:334] "Generic (PLEG): container finished" podID="603c4295-1159-4a8b-856f-c40cb2a0838c" containerID="a67bd53dbaf6d6017ed771030a1dddfe9558ead27568b4c08eb257e5504dbe9f" exitCode=0 Feb 20 16:56:28 crc kubenswrapper[4697]: I0220 16:56:28.192369 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" event={"ID":"603c4295-1159-4a8b-856f-c40cb2a0838c","Type":"ContainerDied","Data":"a67bd53dbaf6d6017ed771030a1dddfe9558ead27568b4c08eb257e5504dbe9f"} Feb 20 16:56:29 crc kubenswrapper[4697]: I0220 16:56:29.590297 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:29 crc kubenswrapper[4697]: I0220 16:56:29.670051 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-inventory\") pod \"603c4295-1159-4a8b-856f-c40cb2a0838c\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " Feb 20 16:56:29 crc kubenswrapper[4697]: I0220 16:56:29.670196 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbn9h\" (UniqueName: \"kubernetes.io/projected/603c4295-1159-4a8b-856f-c40cb2a0838c-kube-api-access-fbn9h\") pod \"603c4295-1159-4a8b-856f-c40cb2a0838c\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " Feb 20 16:56:29 crc kubenswrapper[4697]: I0220 16:56:29.670288 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-ssh-key-openstack-edpm-ipam\") pod \"603c4295-1159-4a8b-856f-c40cb2a0838c\" (UID: \"603c4295-1159-4a8b-856f-c40cb2a0838c\") " Feb 20 16:56:29 crc kubenswrapper[4697]: I0220 16:56:29.675993 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603c4295-1159-4a8b-856f-c40cb2a0838c-kube-api-access-fbn9h" (OuterVolumeSpecName: "kube-api-access-fbn9h") pod "603c4295-1159-4a8b-856f-c40cb2a0838c" (UID: "603c4295-1159-4a8b-856f-c40cb2a0838c"). InnerVolumeSpecName "kube-api-access-fbn9h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:56:29 crc kubenswrapper[4697]: I0220 16:56:29.698063 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-inventory" (OuterVolumeSpecName: "inventory") pod "603c4295-1159-4a8b-856f-c40cb2a0838c" (UID: "603c4295-1159-4a8b-856f-c40cb2a0838c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:56:29 crc kubenswrapper[4697]: I0220 16:56:29.707730 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "603c4295-1159-4a8b-856f-c40cb2a0838c" (UID: "603c4295-1159-4a8b-856f-c40cb2a0838c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:56:29 crc kubenswrapper[4697]: I0220 16:56:29.772897 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 16:56:29 crc kubenswrapper[4697]: I0220 16:56:29.772949 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603c4295-1159-4a8b-856f-c40cb2a0838c-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 16:56:29 crc kubenswrapper[4697]: I0220 16:56:29.772964 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbn9h\" (UniqueName: \"kubernetes.io/projected/603c4295-1159-4a8b-856f-c40cb2a0838c-kube-api-access-fbn9h\") on node \"crc\" DevicePath \"\"" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.237922 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" 
event={"ID":"603c4295-1159-4a8b-856f-c40cb2a0838c","Type":"ContainerDied","Data":"0e03db6c1da40b6118d04a187b7a87c33317ec12fc2c325ee6dccc4ea883bd52"} Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.237989 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e03db6c1da40b6118d04a187b7a87c33317ec12fc2c325ee6dccc4ea883bd52" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.238115 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gp4g2" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.310447 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj"] Feb 20 16:56:30 crc kubenswrapper[4697]: E0220 16:56:30.310981 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603c4295-1159-4a8b-856f-c40cb2a0838c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.310999 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="603c4295-1159-4a8b-856f-c40cb2a0838c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.311283 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="603c4295-1159-4a8b-856f-c40cb2a0838c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.312119 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.315281 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.317249 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.317593 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.317913 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.325237 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj"] Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.384358 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7dn7\" (UniqueName: \"kubernetes.io/projected/3343ad5b-476c-4e27-a5f7-e7948d8eed62-kube-api-access-b7dn7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.384421 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 
16:56:30.384842 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.384899 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.486536 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.486740 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.486778 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.486851 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7dn7\" (UniqueName: \"kubernetes.io/projected/3343ad5b-476c-4e27-a5f7-e7948d8eed62-kube-api-access-b7dn7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.491319 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.492117 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.493936 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.509806 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7dn7\" (UniqueName: \"kubernetes.io/projected/3343ad5b-476c-4e27-a5f7-e7948d8eed62-kube-api-access-b7dn7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:30 crc kubenswrapper[4697]: I0220 16:56:30.667859 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:56:31 crc kubenswrapper[4697]: I0220 16:56:31.185406 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 16:56:31 crc kubenswrapper[4697]: I0220 16:56:31.185924 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 16:56:31 crc kubenswrapper[4697]: I0220 16:56:31.186010 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 16:56:31 crc kubenswrapper[4697]: I0220 16:56:31.187165 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382"} 
pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 16:56:31 crc kubenswrapper[4697]: I0220 16:56:31.187274 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" gracePeriod=600 Feb 20 16:56:31 crc kubenswrapper[4697]: I0220 16:56:31.208897 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj"] Feb 20 16:56:31 crc kubenswrapper[4697]: I0220 16:56:31.268793 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" event={"ID":"3343ad5b-476c-4e27-a5f7-e7948d8eed62","Type":"ContainerStarted","Data":"4adb370bf2bc6d4b95fdc51354a4827799bc98e41c2ecfbd1117667a3679cef8"} Feb 20 16:56:31 crc kubenswrapper[4697]: E0220 16:56:31.312733 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:56:32 crc kubenswrapper[4697]: I0220 16:56:32.282329 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" event={"ID":"3343ad5b-476c-4e27-a5f7-e7948d8eed62","Type":"ContainerStarted","Data":"83707b2ce320aa58806ff41e56861ffef038464c3200fc22bff60fc0bae49ca3"} Feb 20 16:56:32 crc kubenswrapper[4697]: I0220 16:56:32.287566 4697 generic.go:334] 
"Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" exitCode=0 Feb 20 16:56:32 crc kubenswrapper[4697]: I0220 16:56:32.287621 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382"} Feb 20 16:56:32 crc kubenswrapper[4697]: I0220 16:56:32.287658 4697 scope.go:117] "RemoveContainer" containerID="18b63bf23bfaf1519d7cd30ced77d1ca85c9b60f2bab25b83e9d358614cbd28d" Feb 20 16:56:32 crc kubenswrapper[4697]: I0220 16:56:32.288528 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:56:32 crc kubenswrapper[4697]: E0220 16:56:32.289229 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:56:32 crc kubenswrapper[4697]: I0220 16:56:32.315831 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" podStartSLOduration=1.886705222 podStartE2EDuration="2.315802053s" podCreationTimestamp="2026-02-20 16:56:30 +0000 UTC" firstStartedPulling="2026-02-20 16:56:31.221483681 +0000 UTC m=+1499.001529089" lastFinishedPulling="2026-02-20 16:56:31.650580512 +0000 UTC m=+1499.430625920" observedRunningTime="2026-02-20 16:56:32.304025491 +0000 UTC m=+1500.084070909" watchObservedRunningTime="2026-02-20 16:56:32.315802053 +0000 UTC 
m=+1500.095847471" Feb 20 16:56:43 crc kubenswrapper[4697]: I0220 16:56:43.368578 4697 scope.go:117] "RemoveContainer" containerID="8e53ff167cd9abe968c8a185ef948cbf23e017beb3536675e043146d957b5ff9" Feb 20 16:56:43 crc kubenswrapper[4697]: I0220 16:56:43.396454 4697 scope.go:117] "RemoveContainer" containerID="025f2c983bd43551b0e9c62d8fa67bc40b843ab775b826c03b5058a2e419c58c" Feb 20 16:56:43 crc kubenswrapper[4697]: I0220 16:56:43.461166 4697 scope.go:117] "RemoveContainer" containerID="6ddc72e4e05a1c71fab5df4cf5556bcf58783c78bdbb538457733b21d00a56f3" Feb 20 16:56:46 crc kubenswrapper[4697]: I0220 16:56:46.877480 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:56:46 crc kubenswrapper[4697]: E0220 16:56:46.878366 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:57:00 crc kubenswrapper[4697]: I0220 16:57:00.877099 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:57:00 crc kubenswrapper[4697]: E0220 16:57:00.878066 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:57:13 crc kubenswrapper[4697]: I0220 16:57:13.877525 4697 scope.go:117] "RemoveContainer" 
containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:57:13 crc kubenswrapper[4697]: E0220 16:57:13.878353 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:57:26 crc kubenswrapper[4697]: I0220 16:57:26.877562 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:57:26 crc kubenswrapper[4697]: E0220 16:57:26.878610 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:57:40 crc kubenswrapper[4697]: I0220 16:57:40.877098 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:57:40 crc kubenswrapper[4697]: E0220 16:57:40.878766 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:57:43 crc kubenswrapper[4697]: I0220 16:57:43.573153 4697 scope.go:117] 
"RemoveContainer" containerID="2792f05136401783af589827b871e78e478603352525ff27d3ed9941cd730bfd" Feb 20 16:57:54 crc kubenswrapper[4697]: I0220 16:57:54.877127 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:57:54 crc kubenswrapper[4697]: E0220 16:57:54.878054 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:58:08 crc kubenswrapper[4697]: I0220 16:58:08.877813 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:58:08 crc kubenswrapper[4697]: E0220 16:58:08.880089 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:58:22 crc kubenswrapper[4697]: I0220 16:58:22.886218 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:58:22 crc kubenswrapper[4697]: E0220 16:58:22.887109 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:58:33 crc kubenswrapper[4697]: I0220 16:58:33.878072 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:58:33 crc kubenswrapper[4697]: E0220 16:58:33.878852 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:58:45 crc kubenswrapper[4697]: I0220 16:58:45.877091 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:58:45 crc kubenswrapper[4697]: E0220 16:58:45.878112 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:59:00 crc kubenswrapper[4697]: I0220 16:59:00.878520 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:59:00 crc kubenswrapper[4697]: E0220 16:59:00.879337 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:59:11 crc kubenswrapper[4697]: I0220 16:59:11.876727 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:59:11 crc kubenswrapper[4697]: E0220 16:59:11.878818 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:59:24 crc kubenswrapper[4697]: I0220 16:59:24.877012 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:59:24 crc kubenswrapper[4697]: E0220 16:59:24.877888 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:59:29 crc kubenswrapper[4697]: I0220 16:59:29.044588 4697 generic.go:334] "Generic (PLEG): container finished" podID="3343ad5b-476c-4e27-a5f7-e7948d8eed62" containerID="83707b2ce320aa58806ff41e56861ffef038464c3200fc22bff60fc0bae49ca3" exitCode=0 Feb 20 16:59:29 crc kubenswrapper[4697]: I0220 16:59:29.045159 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" event={"ID":"3343ad5b-476c-4e27-a5f7-e7948d8eed62","Type":"ContainerDied","Data":"83707b2ce320aa58806ff41e56861ffef038464c3200fc22bff60fc0bae49ca3"} Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.442768 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.605616 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7dn7\" (UniqueName: \"kubernetes.io/projected/3343ad5b-476c-4e27-a5f7-e7948d8eed62-kube-api-access-b7dn7\") pod \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.606037 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-inventory\") pod \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.606091 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-ssh-key-openstack-edpm-ipam\") pod \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.606155 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-bootstrap-combined-ca-bundle\") pod \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\" (UID: \"3343ad5b-476c-4e27-a5f7-e7948d8eed62\") " Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.610910 4697 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3343ad5b-476c-4e27-a5f7-e7948d8eed62" (UID: "3343ad5b-476c-4e27-a5f7-e7948d8eed62"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.622010 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3343ad5b-476c-4e27-a5f7-e7948d8eed62-kube-api-access-b7dn7" (OuterVolumeSpecName: "kube-api-access-b7dn7") pod "3343ad5b-476c-4e27-a5f7-e7948d8eed62" (UID: "3343ad5b-476c-4e27-a5f7-e7948d8eed62"). InnerVolumeSpecName "kube-api-access-b7dn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.643278 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-inventory" (OuterVolumeSpecName: "inventory") pod "3343ad5b-476c-4e27-a5f7-e7948d8eed62" (UID: "3343ad5b-476c-4e27-a5f7-e7948d8eed62"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.644604 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3343ad5b-476c-4e27-a5f7-e7948d8eed62" (UID: "3343ad5b-476c-4e27-a5f7-e7948d8eed62"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.710034 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7dn7\" (UniqueName: \"kubernetes.io/projected/3343ad5b-476c-4e27-a5f7-e7948d8eed62-kube-api-access-b7dn7\") on node \"crc\" DevicePath \"\"" Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.710075 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.710091 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 16:59:30 crc kubenswrapper[4697]: I0220 16:59:30.710104 4697 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3343ad5b-476c-4e27-a5f7-e7948d8eed62-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.065889 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" event={"ID":"3343ad5b-476c-4e27-a5f7-e7948d8eed62","Type":"ContainerDied","Data":"4adb370bf2bc6d4b95fdc51354a4827799bc98e41c2ecfbd1117667a3679cef8"} Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.066139 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4adb370bf2bc6d4b95fdc51354a4827799bc98e41c2ecfbd1117667a3679cef8" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.066300 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.162336 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6"] Feb 20 16:59:31 crc kubenswrapper[4697]: E0220 16:59:31.162876 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3343ad5b-476c-4e27-a5f7-e7948d8eed62" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.162899 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="3343ad5b-476c-4e27-a5f7-e7948d8eed62" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.163078 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="3343ad5b-476c-4e27-a5f7-e7948d8eed62" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.163760 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.165701 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.165836 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.166882 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.173146 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.197905 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6"] Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.225597 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.225951 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hglqq\" (UniqueName: \"kubernetes.io/projected/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-kube-api-access-hglqq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 
16:59:31.226085 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.328309 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hglqq\" (UniqueName: \"kubernetes.io/projected/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-kube-api-access-hglqq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.328396 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.328487 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.332286 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.333746 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.345125 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hglqq\" (UniqueName: \"kubernetes.io/projected/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-kube-api-access-hglqq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 16:59:31 crc kubenswrapper[4697]: I0220 16:59:31.494196 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 16:59:32 crc kubenswrapper[4697]: I0220 16:59:31.997607 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6"] Feb 20 16:59:32 crc kubenswrapper[4697]: I0220 16:59:32.001267 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 16:59:32 crc kubenswrapper[4697]: I0220 16:59:32.076047 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" event={"ID":"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b","Type":"ContainerStarted","Data":"a1aae0be6455ecce5d4338edbf2881299dc6b52e77b9e21c3f010aeffd3c6df1"} Feb 20 16:59:33 crc kubenswrapper[4697]: I0220 16:59:33.086753 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" event={"ID":"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b","Type":"ContainerStarted","Data":"11d07b69cc74e250973d8d6ced1c60be6b316a1dadb290e7a515e573f0503624"} Feb 20 16:59:33 crc kubenswrapper[4697]: I0220 16:59:33.108252 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" podStartSLOduration=1.4215855419999999 podStartE2EDuration="2.108231208s" podCreationTimestamp="2026-02-20 16:59:31 +0000 UTC" firstStartedPulling="2026-02-20 16:59:32.001006098 +0000 UTC m=+1679.781051516" lastFinishedPulling="2026-02-20 16:59:32.687651774 +0000 UTC m=+1680.467697182" observedRunningTime="2026-02-20 16:59:33.100261563 +0000 UTC m=+1680.880306971" watchObservedRunningTime="2026-02-20 16:59:33.108231208 +0000 UTC m=+1680.888276616" Feb 20 16:59:38 crc kubenswrapper[4697]: I0220 16:59:38.877807 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:59:38 crc 
kubenswrapper[4697]: E0220 16:59:38.878364 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 16:59:43 crc kubenswrapper[4697]: I0220 16:59:43.680714 4697 scope.go:117] "RemoveContainer" containerID="ff3885dc951d6b4c46150938d38477867fb56f3443262b2fdc9b8448f5e33595" Feb 20 16:59:43 crc kubenswrapper[4697]: I0220 16:59:43.706035 4697 scope.go:117] "RemoveContainer" containerID="cc4d4ee346506e8d7c5d7bd173bdccf98fc8714174809e7252e1bd87d0d0d276" Feb 20 16:59:43 crc kubenswrapper[4697]: I0220 16:59:43.731239 4697 scope.go:117] "RemoveContainer" containerID="ce4be8338fc3f288907259e62015d0c33d03e1eb65beb628ba9665f6c8ca85f4" Feb 20 16:59:43 crc kubenswrapper[4697]: I0220 16:59:43.756719 4697 scope.go:117] "RemoveContainer" containerID="a85e2d0cac5a29e1de828658fb8697a39f10b0604526427faa1f373616a8c950" Feb 20 16:59:43 crc kubenswrapper[4697]: I0220 16:59:43.778550 4697 scope.go:117] "RemoveContainer" containerID="2546ee47d878476dadc4177e8cb965174cf823b59bc79a2b8eb38fd123cb032a" Feb 20 16:59:43 crc kubenswrapper[4697]: I0220 16:59:43.800104 4697 scope.go:117] "RemoveContainer" containerID="a09dd56597828e773d401594c72aa03914026cc5879c843c41a59bfa76e5b55d" Feb 20 16:59:52 crc kubenswrapper[4697]: I0220 16:59:52.884064 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 16:59:52 crc kubenswrapper[4697]: E0220 16:59:52.884877 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.159184 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm"] Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.164943 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.190150 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm"] Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.203903 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.206597 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.236066 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c561e953-8a30-4fc8-b054-d4d79cec06be-config-volume\") pod \"collect-profiles-29526780-zbftm\" (UID: \"c561e953-8a30-4fc8-b054-d4d79cec06be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.236133 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c561e953-8a30-4fc8-b054-d4d79cec06be-secret-volume\") pod \"collect-profiles-29526780-zbftm\" (UID: 
\"c561e953-8a30-4fc8-b054-d4d79cec06be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.236609 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmtx4\" (UniqueName: \"kubernetes.io/projected/c561e953-8a30-4fc8-b054-d4d79cec06be-kube-api-access-dmtx4\") pod \"collect-profiles-29526780-zbftm\" (UID: \"c561e953-8a30-4fc8-b054-d4d79cec06be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.338922 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmtx4\" (UniqueName: \"kubernetes.io/projected/c561e953-8a30-4fc8-b054-d4d79cec06be-kube-api-access-dmtx4\") pod \"collect-profiles-29526780-zbftm\" (UID: \"c561e953-8a30-4fc8-b054-d4d79cec06be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.339002 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c561e953-8a30-4fc8-b054-d4d79cec06be-config-volume\") pod \"collect-profiles-29526780-zbftm\" (UID: \"c561e953-8a30-4fc8-b054-d4d79cec06be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.339049 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c561e953-8a30-4fc8-b054-d4d79cec06be-secret-volume\") pod \"collect-profiles-29526780-zbftm\" (UID: \"c561e953-8a30-4fc8-b054-d4d79cec06be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.339795 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/c561e953-8a30-4fc8-b054-d4d79cec06be-config-volume\") pod \"collect-profiles-29526780-zbftm\" (UID: \"c561e953-8a30-4fc8-b054-d4d79cec06be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.346019 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c561e953-8a30-4fc8-b054-d4d79cec06be-secret-volume\") pod \"collect-profiles-29526780-zbftm\" (UID: \"c561e953-8a30-4fc8-b054-d4d79cec06be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.356328 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmtx4\" (UniqueName: \"kubernetes.io/projected/c561e953-8a30-4fc8-b054-d4d79cec06be-kube-api-access-dmtx4\") pod \"collect-profiles-29526780-zbftm\" (UID: \"c561e953-8a30-4fc8-b054-d4d79cec06be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.531662 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:00 crc kubenswrapper[4697]: I0220 17:00:00.955927 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm"] Feb 20 17:00:01 crc kubenswrapper[4697]: I0220 17:00:01.345067 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" event={"ID":"c561e953-8a30-4fc8-b054-d4d79cec06be","Type":"ContainerStarted","Data":"9d845d7cd5adac4bc5945aea3fb2bbab479433d0c129e31f7c5f8f92db2d382b"} Feb 20 17:00:01 crc kubenswrapper[4697]: I0220 17:00:01.345117 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" event={"ID":"c561e953-8a30-4fc8-b054-d4d79cec06be","Type":"ContainerStarted","Data":"78eb364d5da7d3a8b34ea13d678e5e00f5900cff3160c3b8c2fdb795a75b7786"} Feb 20 17:00:01 crc kubenswrapper[4697]: I0220 17:00:01.363055 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" podStartSLOduration=1.363032668 podStartE2EDuration="1.363032668s" podCreationTimestamp="2026-02-20 17:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 17:00:01.361705655 +0000 UTC m=+1709.141751063" watchObservedRunningTime="2026-02-20 17:00:01.363032668 +0000 UTC m=+1709.143078086" Feb 20 17:00:02 crc kubenswrapper[4697]: I0220 17:00:02.358649 4697 generic.go:334] "Generic (PLEG): container finished" podID="c561e953-8a30-4fc8-b054-d4d79cec06be" containerID="9d845d7cd5adac4bc5945aea3fb2bbab479433d0c129e31f7c5f8f92db2d382b" exitCode=0 Feb 20 17:00:02 crc kubenswrapper[4697]: I0220 17:00:02.359017 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" event={"ID":"c561e953-8a30-4fc8-b054-d4d79cec06be","Type":"ContainerDied","Data":"9d845d7cd5adac4bc5945aea3fb2bbab479433d0c129e31f7c5f8f92db2d382b"} Feb 20 17:00:03 crc kubenswrapper[4697]: I0220 17:00:03.767631 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:03 crc kubenswrapper[4697]: I0220 17:00:03.816015 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmtx4\" (UniqueName: \"kubernetes.io/projected/c561e953-8a30-4fc8-b054-d4d79cec06be-kube-api-access-dmtx4\") pod \"c561e953-8a30-4fc8-b054-d4d79cec06be\" (UID: \"c561e953-8a30-4fc8-b054-d4d79cec06be\") " Feb 20 17:00:03 crc kubenswrapper[4697]: I0220 17:00:03.816272 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c561e953-8a30-4fc8-b054-d4d79cec06be-secret-volume\") pod \"c561e953-8a30-4fc8-b054-d4d79cec06be\" (UID: \"c561e953-8a30-4fc8-b054-d4d79cec06be\") " Feb 20 17:00:03 crc kubenswrapper[4697]: I0220 17:00:03.816364 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c561e953-8a30-4fc8-b054-d4d79cec06be-config-volume\") pod \"c561e953-8a30-4fc8-b054-d4d79cec06be\" (UID: \"c561e953-8a30-4fc8-b054-d4d79cec06be\") " Feb 20 17:00:03 crc kubenswrapper[4697]: I0220 17:00:03.817342 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c561e953-8a30-4fc8-b054-d4d79cec06be-config-volume" (OuterVolumeSpecName: "config-volume") pod "c561e953-8a30-4fc8-b054-d4d79cec06be" (UID: "c561e953-8a30-4fc8-b054-d4d79cec06be"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 17:00:03 crc kubenswrapper[4697]: I0220 17:00:03.823671 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c561e953-8a30-4fc8-b054-d4d79cec06be-kube-api-access-dmtx4" (OuterVolumeSpecName: "kube-api-access-dmtx4") pod "c561e953-8a30-4fc8-b054-d4d79cec06be" (UID: "c561e953-8a30-4fc8-b054-d4d79cec06be"). InnerVolumeSpecName "kube-api-access-dmtx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:00:03 crc kubenswrapper[4697]: I0220 17:00:03.824365 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c561e953-8a30-4fc8-b054-d4d79cec06be-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c561e953-8a30-4fc8-b054-d4d79cec06be" (UID: "c561e953-8a30-4fc8-b054-d4d79cec06be"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:00:03 crc kubenswrapper[4697]: I0220 17:00:03.918882 4697 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c561e953-8a30-4fc8-b054-d4d79cec06be-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 17:00:03 crc kubenswrapper[4697]: I0220 17:00:03.918911 4697 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c561e953-8a30-4fc8-b054-d4d79cec06be-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 17:00:03 crc kubenswrapper[4697]: I0220 17:00:03.918924 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmtx4\" (UniqueName: \"kubernetes.io/projected/c561e953-8a30-4fc8-b054-d4d79cec06be-kube-api-access-dmtx4\") on node \"crc\" DevicePath \"\"" Feb 20 17:00:04 crc kubenswrapper[4697]: I0220 17:00:04.408502 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" 
event={"ID":"c561e953-8a30-4fc8-b054-d4d79cec06be","Type":"ContainerDied","Data":"78eb364d5da7d3a8b34ea13d678e5e00f5900cff3160c3b8c2fdb795a75b7786"} Feb 20 17:00:04 crc kubenswrapper[4697]: I0220 17:00:04.408542 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78eb364d5da7d3a8b34ea13d678e5e00f5900cff3160c3b8c2fdb795a75b7786" Feb 20 17:00:04 crc kubenswrapper[4697]: I0220 17:00:04.408594 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm" Feb 20 17:00:07 crc kubenswrapper[4697]: I0220 17:00:07.877688 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 17:00:07 crc kubenswrapper[4697]: E0220 17:00:07.878978 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.054530 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-fm2pl"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.066105 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0d09-account-create-update-d99d8"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.081821 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dzvcx"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.090724 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-8382-account-create-update-fnxqt"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.099687 4697 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-f27ln"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.108213 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-8382-account-create-update-fnxqt"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.115645 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e609-account-create-update-r9q9h"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.123508 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0d09-account-create-update-d99d8"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.131428 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e609-account-create-update-r9q9h"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.139966 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-f27ln"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.149595 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dzvcx"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.158201 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-fm2pl"] Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.886847 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="222d06f8-6f3e-45ce-9acd-18c6d624edf4" path="/var/lib/kubelet/pods/222d06f8-6f3e-45ce-9acd-18c6d624edf4/volumes" Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.887369 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c5e0bd-8650-46b9-a189-72bf5090b0f7" path="/var/lib/kubelet/pods/36c5e0bd-8650-46b9-a189-72bf5090b0f7/volumes" Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.887948 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bdc7f7e-de85-48cb-b75d-969ff2d39d14" 
path="/var/lib/kubelet/pods/4bdc7f7e-de85-48cb-b75d-969ff2d39d14/volumes" Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.888480 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a72078d9-fd39-496f-b41d-ffb493c9bc14" path="/var/lib/kubelet/pods/a72078d9-fd39-496f-b41d-ffb493c9bc14/volumes" Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.889464 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f66cbebc-f05e-4503-9e6e-8877ea8904fb" path="/var/lib/kubelet/pods/f66cbebc-f05e-4503-9e6e-8877ea8904fb/volumes" Feb 20 17:00:16 crc kubenswrapper[4697]: I0220 17:00:16.890776 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93ead78-caf8-4bda-a0be-ef377041fb5a" path="/var/lib/kubelet/pods/f93ead78-caf8-4bda-a0be-ef377041fb5a/volumes" Feb 20 17:00:17 crc kubenswrapper[4697]: I0220 17:00:17.050953 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2lhlv"] Feb 20 17:00:17 crc kubenswrapper[4697]: I0220 17:00:17.062887 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a1b1-account-create-update-2w7nj"] Feb 20 17:00:17 crc kubenswrapper[4697]: I0220 17:00:17.072137 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a1b1-account-create-update-2w7nj"] Feb 20 17:00:17 crc kubenswrapper[4697]: I0220 17:00:17.079993 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2lhlv"] Feb 20 17:00:18 crc kubenswrapper[4697]: I0220 17:00:18.877888 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 17:00:18 crc kubenswrapper[4697]: E0220 17:00:18.878860 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:00:18 crc kubenswrapper[4697]: I0220 17:00:18.890751 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1be73ad-831a-4fab-9b34-646e75b34e01" path="/var/lib/kubelet/pods/b1be73ad-831a-4fab-9b34-646e75b34e01/volumes" Feb 20 17:00:18 crc kubenswrapper[4697]: I0220 17:00:18.892819 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde77bd7-58d5-4213-9793-7f8549cb4ff5" path="/var/lib/kubelet/pods/dde77bd7-58d5-4213-9793-7f8549cb4ff5/volumes" Feb 20 17:00:30 crc kubenswrapper[4697]: I0220 17:00:30.879473 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 17:00:30 crc kubenswrapper[4697]: E0220 17:00:30.881312 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:00:36 crc kubenswrapper[4697]: I0220 17:00:36.046025 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mxxjc"] Feb 20 17:00:36 crc kubenswrapper[4697]: I0220 17:00:36.059815 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mxxjc"] Feb 20 17:00:36 crc kubenswrapper[4697]: I0220 17:00:36.887374 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e7c55b-4906-4b63-94ac-5503110f6c8f" path="/var/lib/kubelet/pods/28e7c55b-4906-4b63-94ac-5503110f6c8f/volumes" Feb 20 17:00:43 crc 
kubenswrapper[4697]: I0220 17:00:43.889116 4697 scope.go:117] "RemoveContainer" containerID="a9055d73a8156fed58aa4f45342897bca77336c25719dff76e5cde28821f946b" Feb 20 17:00:43 crc kubenswrapper[4697]: I0220 17:00:43.924684 4697 scope.go:117] "RemoveContainer" containerID="9de9d4761129d68c8bd03e71ca2487c9f0cf1fbe0609307192957117cddcdfe9" Feb 20 17:00:43 crc kubenswrapper[4697]: I0220 17:00:43.997301 4697 scope.go:117] "RemoveContainer" containerID="08f209d6039c536d1f4a314b16d3302cdb150f865b94fe9edde4ddbcae026cea" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.048210 4697 scope.go:117] "RemoveContainer" containerID="769a09d472a35b799718765a4a9a09b190b50e8a33ffc695bb527e5467b54067" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.104836 4697 scope.go:117] "RemoveContainer" containerID="ea5ac79f7997f03e2b75422aadf6790db8fa562e94beabc4cac6e060d02aa0f3" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.164391 4697 scope.go:117] "RemoveContainer" containerID="c94a65cd6dc725cf5daf669470a052a9e002cfc0db79a402ec0bebdf8a7c5a0d" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.214549 4697 scope.go:117] "RemoveContainer" containerID="babe786dd755d89365a5b351ff85e2fa16212923176a0da69ab437d303a8627e" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.238919 4697 scope.go:117] "RemoveContainer" containerID="9291241e7f3e6e433f9d5756c9ccfb27273b0773b3c37ae187be09ed03573bea" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.266276 4697 scope.go:117] "RemoveContainer" containerID="92074cc543c50d5e7f01442d6308c34e68e974cd89665a1bc2ba3e1b5e68c9cf" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.382274 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bw7tq"] Feb 20 17:00:44 crc kubenswrapper[4697]: E0220 17:00:44.382801 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c561e953-8a30-4fc8-b054-d4d79cec06be" containerName="collect-profiles" Feb 20 17:00:44 crc 
kubenswrapper[4697]: I0220 17:00:44.382823 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="c561e953-8a30-4fc8-b054-d4d79cec06be" containerName="collect-profiles" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.383098 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="c561e953-8a30-4fc8-b054-d4d79cec06be" containerName="collect-profiles" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.385134 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.386347 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qlm\" (UniqueName: \"kubernetes.io/projected/24349951-c1c3-472c-9ded-2ebd90fcb328-kube-api-access-m5qlm\") pod \"redhat-marketplace-bw7tq\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.386546 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-catalog-content\") pod \"redhat-marketplace-bw7tq\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.386590 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-utilities\") pod \"redhat-marketplace-bw7tq\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.397258 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw7tq"] Feb 20 17:00:44 
crc kubenswrapper[4697]: I0220 17:00:44.488369 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-catalog-content\") pod \"redhat-marketplace-bw7tq\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.488423 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-utilities\") pod \"redhat-marketplace-bw7tq\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.488975 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-catalog-content\") pod \"redhat-marketplace-bw7tq\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.489000 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-utilities\") pod \"redhat-marketplace-bw7tq\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.489097 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qlm\" (UniqueName: \"kubernetes.io/projected/24349951-c1c3-472c-9ded-2ebd90fcb328-kube-api-access-m5qlm\") pod \"redhat-marketplace-bw7tq\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.508893 4697 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qlm\" (UniqueName: \"kubernetes.io/projected/24349951-c1c3-472c-9ded-2ebd90fcb328-kube-api-access-m5qlm\") pod \"redhat-marketplace-bw7tq\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.744651 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:44 crc kubenswrapper[4697]: I0220 17:00:44.877329 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 17:00:44 crc kubenswrapper[4697]: E0220 17:00:44.877914 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:00:45 crc kubenswrapper[4697]: I0220 17:00:45.038199 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wtqdz"] Feb 20 17:00:45 crc kubenswrapper[4697]: I0220 17:00:45.049953 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wtqdz"] Feb 20 17:00:45 crc kubenswrapper[4697]: I0220 17:00:45.206160 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw7tq"] Feb 20 17:00:45 crc kubenswrapper[4697]: I0220 17:00:45.845043 4697 generic.go:334] "Generic (PLEG): container finished" podID="24349951-c1c3-472c-9ded-2ebd90fcb328" containerID="51aaab59110420755cc4efd83389e3a36ac578939e1b49df914553c491ca6761" exitCode=0 Feb 20 17:00:45 crc kubenswrapper[4697]: I0220 17:00:45.845116 
4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw7tq" event={"ID":"24349951-c1c3-472c-9ded-2ebd90fcb328","Type":"ContainerDied","Data":"51aaab59110420755cc4efd83389e3a36ac578939e1b49df914553c491ca6761"} Feb 20 17:00:45 crc kubenswrapper[4697]: I0220 17:00:45.845166 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw7tq" event={"ID":"24349951-c1c3-472c-9ded-2ebd90fcb328","Type":"ContainerStarted","Data":"12fc1a876c93ece90a4681f23e78fdb7c1ff7f85dc1a4d1684a9d0598520625d"} Feb 20 17:00:46 crc kubenswrapper[4697]: I0220 17:00:46.888676 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c949e32-f57d-4f71-aaae-192d3ceea6de" path="/var/lib/kubelet/pods/1c949e32-f57d-4f71-aaae-192d3ceea6de/volumes" Feb 20 17:00:47 crc kubenswrapper[4697]: I0220 17:00:47.865508 4697 generic.go:334] "Generic (PLEG): container finished" podID="24349951-c1c3-472c-9ded-2ebd90fcb328" containerID="c4419b87cc952d0d6c19400521dd4f44a5c4c0ca0620325b2fab4273c89bc97a" exitCode=0 Feb 20 17:00:47 crc kubenswrapper[4697]: I0220 17:00:47.865608 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw7tq" event={"ID":"24349951-c1c3-472c-9ded-2ebd90fcb328","Type":"ContainerDied","Data":"c4419b87cc952d0d6c19400521dd4f44a5c4c0ca0620325b2fab4273c89bc97a"} Feb 20 17:00:48 crc kubenswrapper[4697]: I0220 17:00:48.889253 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw7tq" event={"ID":"24349951-c1c3-472c-9ded-2ebd90fcb328","Type":"ContainerStarted","Data":"55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c"} Feb 20 17:00:48 crc kubenswrapper[4697]: I0220 17:00:48.900805 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bw7tq" podStartSLOduration=2.429168069 podStartE2EDuration="4.900784515s" 
podCreationTimestamp="2026-02-20 17:00:44 +0000 UTC" firstStartedPulling="2026-02-20 17:00:45.850026708 +0000 UTC m=+1753.630072116" lastFinishedPulling="2026-02-20 17:00:48.321643114 +0000 UTC m=+1756.101688562" observedRunningTime="2026-02-20 17:00:48.895113175 +0000 UTC m=+1756.675158603" watchObservedRunningTime="2026-02-20 17:00:48.900784515 +0000 UTC m=+1756.680829933" Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.048824 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-fd85-account-create-update-c4m4q"] Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.060630 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-lbhl4"] Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.074919 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hj5zq"] Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.085587 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-lbhl4"] Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.095953 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e438-account-create-update-pwcsf"] Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.103648 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-554gk"] Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.121678 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3eb9-account-create-update-ps2nd"] Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.128749 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-fd85-account-create-update-c4m4q"] Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.136869 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hj5zq"] Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.144593 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-e438-account-create-update-pwcsf"] Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.151680 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-554gk"] Feb 20 17:00:51 crc kubenswrapper[4697]: I0220 17:00:51.159225 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3eb9-account-create-update-ps2nd"] Feb 20 17:00:52 crc kubenswrapper[4697]: I0220 17:00:52.896790 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb" path="/var/lib/kubelet/pods/085ec1b2-25fd-4df4-b5d2-3d9d04ade6bb/volumes" Feb 20 17:00:52 crc kubenswrapper[4697]: I0220 17:00:52.897920 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d2e30bf-2042-4e8d-b465-f1cfe98b2678" path="/var/lib/kubelet/pods/6d2e30bf-2042-4e8d-b465-f1cfe98b2678/volumes" Feb 20 17:00:52 crc kubenswrapper[4697]: I0220 17:00:52.898651 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="785b68ff-7aef-47b3-85d8-587116e8ce3a" path="/var/lib/kubelet/pods/785b68ff-7aef-47b3-85d8-587116e8ce3a/volumes" Feb 20 17:00:52 crc kubenswrapper[4697]: I0220 17:00:52.899372 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87689cb4-4d4f-431a-958e-23385b3fa01d" path="/var/lib/kubelet/pods/87689cb4-4d4f-431a-958e-23385b3fa01d/volumes" Feb 20 17:00:52 crc kubenswrapper[4697]: I0220 17:00:52.900771 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0ca09dc-ef65-4fec-8fea-eb06f96f3818" path="/var/lib/kubelet/pods/c0ca09dc-ef65-4fec-8fea-eb06f96f3818/volumes" Feb 20 17:00:52 crc kubenswrapper[4697]: I0220 17:00:52.901582 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5a353b-3fec-411e-8508-3b440af66824" path="/var/lib/kubelet/pods/df5a353b-3fec-411e-8508-3b440af66824/volumes" Feb 20 17:00:54 crc kubenswrapper[4697]: I0220 17:00:54.745155 4697 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:54 crc kubenswrapper[4697]: I0220 17:00:54.746485 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:54 crc kubenswrapper[4697]: I0220 17:00:54.835530 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:55 crc kubenswrapper[4697]: I0220 17:00:55.010355 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:55 crc kubenswrapper[4697]: I0220 17:00:55.083499 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw7tq"] Feb 20 17:00:55 crc kubenswrapper[4697]: I0220 17:00:55.877739 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 17:00:55 crc kubenswrapper[4697]: E0220 17:00:55.878320 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:00:56 crc kubenswrapper[4697]: I0220 17:00:56.958388 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bw7tq" podUID="24349951-c1c3-472c-9ded-2ebd90fcb328" containerName="registry-server" containerID="cri-o://55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c" gracePeriod=2 Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.451360 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.649398 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5qlm\" (UniqueName: \"kubernetes.io/projected/24349951-c1c3-472c-9ded-2ebd90fcb328-kube-api-access-m5qlm\") pod \"24349951-c1c3-472c-9ded-2ebd90fcb328\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.649557 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-catalog-content\") pod \"24349951-c1c3-472c-9ded-2ebd90fcb328\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.649673 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-utilities\") pod \"24349951-c1c3-472c-9ded-2ebd90fcb328\" (UID: \"24349951-c1c3-472c-9ded-2ebd90fcb328\") " Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.650991 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-utilities" (OuterVolumeSpecName: "utilities") pod "24349951-c1c3-472c-9ded-2ebd90fcb328" (UID: "24349951-c1c3-472c-9ded-2ebd90fcb328"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.655841 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24349951-c1c3-472c-9ded-2ebd90fcb328-kube-api-access-m5qlm" (OuterVolumeSpecName: "kube-api-access-m5qlm") pod "24349951-c1c3-472c-9ded-2ebd90fcb328" (UID: "24349951-c1c3-472c-9ded-2ebd90fcb328"). InnerVolumeSpecName "kube-api-access-m5qlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.684674 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24349951-c1c3-472c-9ded-2ebd90fcb328" (UID: "24349951-c1c3-472c-9ded-2ebd90fcb328"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.752712 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.752752 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24349951-c1c3-472c-9ded-2ebd90fcb328-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.752766 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5qlm\" (UniqueName: \"kubernetes.io/projected/24349951-c1c3-472c-9ded-2ebd90fcb328-kube-api-access-m5qlm\") on node \"crc\" DevicePath \"\"" Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.969634 4697 generic.go:334] "Generic (PLEG): container finished" podID="24349951-c1c3-472c-9ded-2ebd90fcb328" containerID="55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c" exitCode=0 Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.969688 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw7tq" event={"ID":"24349951-c1c3-472c-9ded-2ebd90fcb328","Type":"ContainerDied","Data":"55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c"} Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.969702 4697 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw7tq" Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.969727 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw7tq" event={"ID":"24349951-c1c3-472c-9ded-2ebd90fcb328","Type":"ContainerDied","Data":"12fc1a876c93ece90a4681f23e78fdb7c1ff7f85dc1a4d1684a9d0598520625d"} Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.969752 4697 scope.go:117] "RemoveContainer" containerID="55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c" Feb 20 17:00:57 crc kubenswrapper[4697]: I0220 17:00:57.990152 4697 scope.go:117] "RemoveContainer" containerID="c4419b87cc952d0d6c19400521dd4f44a5c4c0ca0620325b2fab4273c89bc97a" Feb 20 17:00:58 crc kubenswrapper[4697]: I0220 17:00:58.006304 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw7tq"] Feb 20 17:00:58 crc kubenswrapper[4697]: I0220 17:00:58.016449 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw7tq"] Feb 20 17:00:58 crc kubenswrapper[4697]: I0220 17:00:58.020980 4697 scope.go:117] "RemoveContainer" containerID="51aaab59110420755cc4efd83389e3a36ac578939e1b49df914553c491ca6761" Feb 20 17:00:58 crc kubenswrapper[4697]: I0220 17:00:58.067855 4697 scope.go:117] "RemoveContainer" containerID="55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c" Feb 20 17:00:58 crc kubenswrapper[4697]: E0220 17:00:58.068274 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c\": container with ID starting with 55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c not found: ID does not exist" containerID="55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c" Feb 20 17:00:58 crc kubenswrapper[4697]: I0220 17:00:58.068304 4697 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c"} err="failed to get container status \"55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c\": rpc error: code = NotFound desc = could not find container \"55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c\": container with ID starting with 55e4481d3a7f07954b51c16112570fecf306f46fe955ee51f3ff8cad75fee43c not found: ID does not exist" Feb 20 17:00:58 crc kubenswrapper[4697]: I0220 17:00:58.068337 4697 scope.go:117] "RemoveContainer" containerID="c4419b87cc952d0d6c19400521dd4f44a5c4c0ca0620325b2fab4273c89bc97a" Feb 20 17:00:58 crc kubenswrapper[4697]: E0220 17:00:58.068665 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4419b87cc952d0d6c19400521dd4f44a5c4c0ca0620325b2fab4273c89bc97a\": container with ID starting with c4419b87cc952d0d6c19400521dd4f44a5c4c0ca0620325b2fab4273c89bc97a not found: ID does not exist" containerID="c4419b87cc952d0d6c19400521dd4f44a5c4c0ca0620325b2fab4273c89bc97a" Feb 20 17:00:58 crc kubenswrapper[4697]: I0220 17:00:58.068717 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4419b87cc952d0d6c19400521dd4f44a5c4c0ca0620325b2fab4273c89bc97a"} err="failed to get container status \"c4419b87cc952d0d6c19400521dd4f44a5c4c0ca0620325b2fab4273c89bc97a\": rpc error: code = NotFound desc = could not find container \"c4419b87cc952d0d6c19400521dd4f44a5c4c0ca0620325b2fab4273c89bc97a\": container with ID starting with c4419b87cc952d0d6c19400521dd4f44a5c4c0ca0620325b2fab4273c89bc97a not found: ID does not exist" Feb 20 17:00:58 crc kubenswrapper[4697]: I0220 17:00:58.068752 4697 scope.go:117] "RemoveContainer" containerID="51aaab59110420755cc4efd83389e3a36ac578939e1b49df914553c491ca6761" Feb 20 17:00:58 crc kubenswrapper[4697]: E0220 
17:00:58.068995 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51aaab59110420755cc4efd83389e3a36ac578939e1b49df914553c491ca6761\": container with ID starting with 51aaab59110420755cc4efd83389e3a36ac578939e1b49df914553c491ca6761 not found: ID does not exist" containerID="51aaab59110420755cc4efd83389e3a36ac578939e1b49df914553c491ca6761" Feb 20 17:00:58 crc kubenswrapper[4697]: I0220 17:00:58.069026 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51aaab59110420755cc4efd83389e3a36ac578939e1b49df914553c491ca6761"} err="failed to get container status \"51aaab59110420755cc4efd83389e3a36ac578939e1b49df914553c491ca6761\": rpc error: code = NotFound desc = could not find container \"51aaab59110420755cc4efd83389e3a36ac578939e1b49df914553c491ca6761\": container with ID starting with 51aaab59110420755cc4efd83389e3a36ac578939e1b49df914553c491ca6761 not found: ID does not exist" Feb 20 17:00:58 crc kubenswrapper[4697]: I0220 17:00:58.913335 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24349951-c1c3-472c-9ded-2ebd90fcb328" path="/var/lib/kubelet/pods/24349951-c1c3-472c-9ded-2ebd90fcb328/volumes" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.218576 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29526781-56g9l"] Feb 20 17:01:00 crc kubenswrapper[4697]: E0220 17:01:00.219553 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24349951-c1c3-472c-9ded-2ebd90fcb328" containerName="extract-utilities" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.219573 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="24349951-c1c3-472c-9ded-2ebd90fcb328" containerName="extract-utilities" Feb 20 17:01:00 crc kubenswrapper[4697]: E0220 17:01:00.219598 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24349951-c1c3-472c-9ded-2ebd90fcb328" 
containerName="registry-server" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.219606 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="24349951-c1c3-472c-9ded-2ebd90fcb328" containerName="registry-server" Feb 20 17:01:00 crc kubenswrapper[4697]: E0220 17:01:00.219626 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24349951-c1c3-472c-9ded-2ebd90fcb328" containerName="extract-content" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.219646 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="24349951-c1c3-472c-9ded-2ebd90fcb328" containerName="extract-content" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.219913 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="24349951-c1c3-472c-9ded-2ebd90fcb328" containerName="registry-server" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.220793 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.229855 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526781-56g9l"] Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.420054 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-config-data\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.420138 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9h9w\" (UniqueName: \"kubernetes.io/projected/fb8e173e-11e6-4bc4-a87e-58fd25b53076-kube-api-access-d9h9w\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 
17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.420190 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-fernet-keys\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.420242 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-combined-ca-bundle\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.522381 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-config-data\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.522475 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9h9w\" (UniqueName: \"kubernetes.io/projected/fb8e173e-11e6-4bc4-a87e-58fd25b53076-kube-api-access-d9h9w\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.522501 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-fernet-keys\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc 
kubenswrapper[4697]: I0220 17:01:00.522548 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-combined-ca-bundle\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.528525 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-combined-ca-bundle\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.529031 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-config-data\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.529349 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-fernet-keys\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.538327 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9h9w\" (UniqueName: \"kubernetes.io/projected/fb8e173e-11e6-4bc4-a87e-58fd25b53076-kube-api-access-d9h9w\") pod \"keystone-cron-29526781-56g9l\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.543297 4697 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:00 crc kubenswrapper[4697]: I0220 17:01:00.959152 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526781-56g9l"] Feb 20 17:01:01 crc kubenswrapper[4697]: I0220 17:01:01.000462 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526781-56g9l" event={"ID":"fb8e173e-11e6-4bc4-a87e-58fd25b53076","Type":"ContainerStarted","Data":"0ebd26ddd0e4d6e3509e40113e57b476850f48dcd554e0675817cea22f9f383f"} Feb 20 17:01:02 crc kubenswrapper[4697]: I0220 17:01:02.008678 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526781-56g9l" event={"ID":"fb8e173e-11e6-4bc4-a87e-58fd25b53076","Type":"ContainerStarted","Data":"5d3da4e3f54473aa53f2e5dbd051e0c4349bb840a3de4d688fcfbd5c25000900"} Feb 20 17:01:04 crc kubenswrapper[4697]: I0220 17:01:04.029996 4697 generic.go:334] "Generic (PLEG): container finished" podID="fb8e173e-11e6-4bc4-a87e-58fd25b53076" containerID="5d3da4e3f54473aa53f2e5dbd051e0c4349bb840a3de4d688fcfbd5c25000900" exitCode=0 Feb 20 17:01:04 crc kubenswrapper[4697]: I0220 17:01:04.030030 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526781-56g9l" event={"ID":"fb8e173e-11e6-4bc4-a87e-58fd25b53076","Type":"ContainerDied","Data":"5d3da4e3f54473aa53f2e5dbd051e0c4349bb840a3de4d688fcfbd5c25000900"} Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.380612 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.424158 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9h9w\" (UniqueName: \"kubernetes.io/projected/fb8e173e-11e6-4bc4-a87e-58fd25b53076-kube-api-access-d9h9w\") pod \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.424286 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-fernet-keys\") pod \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.424421 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-combined-ca-bundle\") pod \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.424599 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-config-data\") pod \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\" (UID: \"fb8e173e-11e6-4bc4-a87e-58fd25b53076\") " Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.432211 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8e173e-11e6-4bc4-a87e-58fd25b53076-kube-api-access-d9h9w" (OuterVolumeSpecName: "kube-api-access-d9h9w") pod "fb8e173e-11e6-4bc4-a87e-58fd25b53076" (UID: "fb8e173e-11e6-4bc4-a87e-58fd25b53076"). InnerVolumeSpecName "kube-api-access-d9h9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.433367 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fb8e173e-11e6-4bc4-a87e-58fd25b53076" (UID: "fb8e173e-11e6-4bc4-a87e-58fd25b53076"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.459127 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb8e173e-11e6-4bc4-a87e-58fd25b53076" (UID: "fb8e173e-11e6-4bc4-a87e-58fd25b53076"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.483355 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-config-data" (OuterVolumeSpecName: "config-data") pod "fb8e173e-11e6-4bc4-a87e-58fd25b53076" (UID: "fb8e173e-11e6-4bc4-a87e-58fd25b53076"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.526828 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.526862 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.526873 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9h9w\" (UniqueName: \"kubernetes.io/projected/fb8e173e-11e6-4bc4-a87e-58fd25b53076-kube-api-access-d9h9w\") on node \"crc\" DevicePath \"\"" Feb 20 17:01:05 crc kubenswrapper[4697]: I0220 17:01:05.526884 4697 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb8e173e-11e6-4bc4-a87e-58fd25b53076-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 17:01:06 crc kubenswrapper[4697]: I0220 17:01:06.033413 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-f7c6f"] Feb 20 17:01:06 crc kubenswrapper[4697]: I0220 17:01:06.044480 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-f7c6f"] Feb 20 17:01:06 crc kubenswrapper[4697]: I0220 17:01:06.071148 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29526781-56g9l" Feb 20 17:01:06 crc kubenswrapper[4697]: I0220 17:01:06.071139 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526781-56g9l" event={"ID":"fb8e173e-11e6-4bc4-a87e-58fd25b53076","Type":"ContainerDied","Data":"0ebd26ddd0e4d6e3509e40113e57b476850f48dcd554e0675817cea22f9f383f"} Feb 20 17:01:06 crc kubenswrapper[4697]: I0220 17:01:06.071312 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ebd26ddd0e4d6e3509e40113e57b476850f48dcd554e0675817cea22f9f383f" Feb 20 17:01:06 crc kubenswrapper[4697]: I0220 17:01:06.072857 4697 generic.go:334] "Generic (PLEG): container finished" podID="671fae5d-08e8-4fac-ba16-e33a5a4f1f0b" containerID="11d07b69cc74e250973d8d6ced1c60be6b316a1dadb290e7a515e573f0503624" exitCode=0 Feb 20 17:01:06 crc kubenswrapper[4697]: I0220 17:01:06.072884 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" event={"ID":"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b","Type":"ContainerDied","Data":"11d07b69cc74e250973d8d6ced1c60be6b316a1dadb290e7a515e573f0503624"} Feb 20 17:01:06 crc kubenswrapper[4697]: I0220 17:01:06.895979 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962571a7-edad-4e80-94b1-f8f1ba8621e4" path="/var/lib/kubelet/pods/962571a7-edad-4e80-94b1-f8f1ba8621e4/volumes" Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.052586 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-v6dg9"] Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.072412 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-v6dg9"] Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.511166 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.679348 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-ssh-key-openstack-edpm-ipam\") pod \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.679500 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hglqq\" (UniqueName: \"kubernetes.io/projected/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-kube-api-access-hglqq\") pod \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.679581 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-inventory\") pod \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\" (UID: \"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b\") " Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.686696 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-kube-api-access-hglqq" (OuterVolumeSpecName: "kube-api-access-hglqq") pod "671fae5d-08e8-4fac-ba16-e33a5a4f1f0b" (UID: "671fae5d-08e8-4fac-ba16-e33a5a4f1f0b"). InnerVolumeSpecName "kube-api-access-hglqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.707345 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "671fae5d-08e8-4fac-ba16-e33a5a4f1f0b" (UID: "671fae5d-08e8-4fac-ba16-e33a5a4f1f0b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.707535 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-inventory" (OuterVolumeSpecName: "inventory") pod "671fae5d-08e8-4fac-ba16-e33a5a4f1f0b" (UID: "671fae5d-08e8-4fac-ba16-e33a5a4f1f0b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.781785 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hglqq\" (UniqueName: \"kubernetes.io/projected/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-kube-api-access-hglqq\") on node \"crc\" DevicePath \"\"" Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.781814 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 17:01:07 crc kubenswrapper[4697]: I0220 17:01:07.781825 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/671fae5d-08e8-4fac-ba16-e33a5a4f1f0b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.096126 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6" 
event={"ID":"671fae5d-08e8-4fac-ba16-e33a5a4f1f0b","Type":"ContainerDied","Data":"a1aae0be6455ecce5d4338edbf2881299dc6b52e77b9e21c3f010aeffd3c6df1"}
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.096170 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1aae0be6455ecce5d4338edbf2881299dc6b52e77b9e21c3f010aeffd3c6df1"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.096234 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.189255 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"]
Feb 20 17:01:08 crc kubenswrapper[4697]: E0220 17:01:08.190196 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671fae5d-08e8-4fac-ba16-e33a5a4f1f0b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.190222 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="671fae5d-08e8-4fac-ba16-e33a5a4f1f0b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:01:08 crc kubenswrapper[4697]: E0220 17:01:08.190248 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8e173e-11e6-4bc4-a87e-58fd25b53076" containerName="keystone-cron"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.190258 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8e173e-11e6-4bc4-a87e-58fd25b53076" containerName="keystone-cron"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.190612 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="671fae5d-08e8-4fac-ba16-e33a5a4f1f0b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.190641 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8e173e-11e6-4bc4-a87e-58fd25b53076" containerName="keystone-cron"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.191760 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.194342 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.195288 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.195316 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.195363 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.205660 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"]
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.393922 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.394169 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vcfw\" (UniqueName: \"kubernetes.io/projected/a2756b57-da81-4893-85d4-119fe103b4de-kube-api-access-4vcfw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.394418 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.495997 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.496056 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vcfw\" (UniqueName: \"kubernetes.io/projected/a2756b57-da81-4893-85d4-119fe103b4de-kube-api-access-4vcfw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.496098 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.503479 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.503900 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.516984 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vcfw\" (UniqueName: \"kubernetes.io/projected/a2756b57-da81-4893-85d4-119fe103b4de-kube-api-access-4vcfw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.815373 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:01:08 crc kubenswrapper[4697]: I0220 17:01:08.897002 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52cf1f6d-5da1-4b45-9dda-dbd21e972107" path="/var/lib/kubelet/pods/52cf1f6d-5da1-4b45-9dda-dbd21e972107/volumes"
Feb 20 17:01:09 crc kubenswrapper[4697]: I0220 17:01:09.382491 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"]
Feb 20 17:01:10 crc kubenswrapper[4697]: I0220 17:01:10.121168 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt" event={"ID":"a2756b57-da81-4893-85d4-119fe103b4de","Type":"ContainerStarted","Data":"1a04a8489282f1e346139a2e1c1ff254604ad0ef53aacaa6df68ca664cb38960"}
Feb 20 17:01:10 crc kubenswrapper[4697]: I0220 17:01:10.122090 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt" event={"ID":"a2756b57-da81-4893-85d4-119fe103b4de","Type":"ContainerStarted","Data":"b327e217d35c6d93e72eb94929487855351a78ab8cea055d7cec8f3866e07458"}
Feb 20 17:01:10 crc kubenswrapper[4697]: I0220 17:01:10.144104 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt" podStartSLOduration=1.6659517940000002 podStartE2EDuration="2.144085782s" podCreationTimestamp="2026-02-20 17:01:08 +0000 UTC" firstStartedPulling="2026-02-20 17:01:09.39059247 +0000 UTC m=+1777.170637888" lastFinishedPulling="2026-02-20 17:01:09.868726478 +0000 UTC m=+1777.648771876" observedRunningTime="2026-02-20 17:01:10.142669217 +0000 UTC m=+1777.922714675" watchObservedRunningTime="2026-02-20 17:01:10.144085782 +0000 UTC m=+1777.924131190"
Feb 20 17:01:10 crc kubenswrapper[4697]: I0220 17:01:10.877203 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382"
Feb 20 17:01:10 crc kubenswrapper[4697]: E0220 17:01:10.879180 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:01:22 crc kubenswrapper[4697]: I0220 17:01:22.884093 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382"
Feb 20 17:01:22 crc kubenswrapper[4697]: E0220 17:01:22.885173 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:01:35 crc kubenswrapper[4697]: I0220 17:01:35.050518 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-k8pk7"]
Feb 20 17:01:35 crc kubenswrapper[4697]: I0220 17:01:35.061846 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-k8pk7"]
Feb 20 17:01:36 crc kubenswrapper[4697]: I0220 17:01:36.889565 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5959da8c-890a-4acb-9781-71b7c9fb33a5" path="/var/lib/kubelet/pods/5959da8c-890a-4acb-9781-71b7c9fb33a5/volumes"
Feb 20 17:01:37 crc kubenswrapper[4697]: I0220 17:01:37.877042 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382"
Feb 20 17:01:38 crc kubenswrapper[4697]: I0220 17:01:38.437000 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"5ab741227b7557de30c7c29e950e092010c8b169886e816c84c4df58e588aa0d"}
Feb 20 17:01:44 crc kubenswrapper[4697]: I0220 17:01:44.475086 4697 scope.go:117] "RemoveContainer" containerID="49c1bdfa25b7858554685eee5a89282cf25449f7f201400f665d8733e5795731"
Feb 20 17:01:44 crc kubenswrapper[4697]: I0220 17:01:44.510079 4697 scope.go:117] "RemoveContainer" containerID="a4b4e357940009fb44409ab931c02605e36f2e3f92444a7c26bd5340d07bd903"
Feb 20 17:01:44 crc kubenswrapper[4697]: I0220 17:01:44.581708 4697 scope.go:117] "RemoveContainer" containerID="95a397c462e7f2e5fa2c00f58fbfe54275d815e830be489dda70c58d23cdba9a"
Feb 20 17:01:44 crc kubenswrapper[4697]: I0220 17:01:44.619417 4697 scope.go:117] "RemoveContainer" containerID="1787c7c15684669833a35fc56d74a3c79b5bff5b74c598b121e187fb68c5191d"
Feb 20 17:01:44 crc kubenswrapper[4697]: I0220 17:01:44.685267 4697 scope.go:117] "RemoveContainer" containerID="55f783900b9f281c401d65d9c7635229d628e5c1c43ff6a9627d4f15d3b15231"
Feb 20 17:01:44 crc kubenswrapper[4697]: I0220 17:01:44.714951 4697 scope.go:117] "RemoveContainer" containerID="501af9233f29b879b25f5ae5f15e15ebd10c2fc86ee1070a21ad8753d2d078f8"
Feb 20 17:01:44 crc kubenswrapper[4697]: I0220 17:01:44.776420 4697 scope.go:117] "RemoveContainer" containerID="31ebe6109cf2053b73b2151a184f8a4aab762a373433c298cd578bb25ba70f58"
Feb 20 17:01:44 crc kubenswrapper[4697]: I0220 17:01:44.801561 4697 scope.go:117] "RemoveContainer" containerID="401a02c5a3493603acce6400b53996e036e2e84421253931a083e50d178add7d"
Feb 20 17:01:44 crc kubenswrapper[4697]: I0220 17:01:44.825874 4697 scope.go:117] "RemoveContainer" containerID="701fbb508a90afd52fab6ef8aacdcb92353cd3b1bea2e88c2e4b55dc45265dd6"
Feb 20 17:01:44 crc kubenswrapper[4697]: I0220 17:01:44.873636 4697 scope.go:117] "RemoveContainer" containerID="9816d60e5ac117fb51a276c86ccbe4935cbb8389ee92837b5e77762786a9b101"
Feb 20 17:01:49 crc kubenswrapper[4697]: I0220 17:01:49.037631 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-lv84x"]
Feb 20 17:01:49 crc kubenswrapper[4697]: I0220 17:01:49.045230 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-lv84x"]
Feb 20 17:01:50 crc kubenswrapper[4697]: I0220 17:01:50.892324 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1aba0bf-7e2c-4a14-894d-d1247f7356eb" path="/var/lib/kubelet/pods/d1aba0bf-7e2c-4a14-894d-d1247f7356eb/volumes"
Feb 20 17:01:51 crc kubenswrapper[4697]: I0220 17:01:51.040739 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dz6jg"]
Feb 20 17:01:51 crc kubenswrapper[4697]: I0220 17:01:51.054787 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dz6jg"]
Feb 20 17:01:52 crc kubenswrapper[4697]: I0220 17:01:52.888526 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fa3e27-befa-408d-ac73-377e04786963" path="/var/lib/kubelet/pods/36fa3e27-befa-408d-ac73-377e04786963/volumes"
Feb 20 17:01:54 crc kubenswrapper[4697]: I0220 17:01:54.039736 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-hjgnj"]
Feb 20 17:01:54 crc kubenswrapper[4697]: I0220 17:01:54.052887 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-hjgnj"]
Feb 20 17:01:54 crc kubenswrapper[4697]: I0220 17:01:54.895912 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39238d29-1c84-4169-b011-455bd2e7f000" path="/var/lib/kubelet/pods/39238d29-1c84-4169-b011-455bd2e7f000/volumes"
Feb 20 17:02:15 crc kubenswrapper[4697]: I0220 17:02:15.033093 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-dzcml"]
Feb 20 17:02:15 crc kubenswrapper[4697]: I0220 17:02:15.045120 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-dzcml"]
Feb 20 17:02:16 crc kubenswrapper[4697]: I0220 17:02:16.896308 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc8e40c-20d6-41e9-9f4e-25112a77e115" path="/var/lib/kubelet/pods/4bc8e40c-20d6-41e9-9f4e-25112a77e115/volumes"
Feb 20 17:02:17 crc kubenswrapper[4697]: I0220 17:02:17.982706 4697 generic.go:334] "Generic (PLEG): container finished" podID="a2756b57-da81-4893-85d4-119fe103b4de" containerID="1a04a8489282f1e346139a2e1c1ff254604ad0ef53aacaa6df68ca664cb38960" exitCode=0
Feb 20 17:02:17 crc kubenswrapper[4697]: I0220 17:02:17.982748 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt" event={"ID":"a2756b57-da81-4893-85d4-119fe103b4de","Type":"ContainerDied","Data":"1a04a8489282f1e346139a2e1c1ff254604ad0ef53aacaa6df68ca664cb38960"}
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.399949 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.568733 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vcfw\" (UniqueName: \"kubernetes.io/projected/a2756b57-da81-4893-85d4-119fe103b4de-kube-api-access-4vcfw\") pod \"a2756b57-da81-4893-85d4-119fe103b4de\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") "
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.568785 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-inventory\") pod \"a2756b57-da81-4893-85d4-119fe103b4de\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") "
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.568889 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-ssh-key-openstack-edpm-ipam\") pod \"a2756b57-da81-4893-85d4-119fe103b4de\" (UID: \"a2756b57-da81-4893-85d4-119fe103b4de\") "
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.574530 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2756b57-da81-4893-85d4-119fe103b4de-kube-api-access-4vcfw" (OuterVolumeSpecName: "kube-api-access-4vcfw") pod "a2756b57-da81-4893-85d4-119fe103b4de" (UID: "a2756b57-da81-4893-85d4-119fe103b4de"). InnerVolumeSpecName "kube-api-access-4vcfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.603720 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a2756b57-da81-4893-85d4-119fe103b4de" (UID: "a2756b57-da81-4893-85d4-119fe103b4de"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.605326 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-inventory" (OuterVolumeSpecName: "inventory") pod "a2756b57-da81-4893-85d4-119fe103b4de" (UID: "a2756b57-da81-4893-85d4-119fe103b4de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.673360 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.673391 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vcfw\" (UniqueName: \"kubernetes.io/projected/a2756b57-da81-4893-85d4-119fe103b4de-kube-api-access-4vcfw\") on node \"crc\" DevicePath \"\""
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.673401 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2756b57-da81-4893-85d4-119fe103b4de-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.999312 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt" event={"ID":"a2756b57-da81-4893-85d4-119fe103b4de","Type":"ContainerDied","Data":"b327e217d35c6d93e72eb94929487855351a78ab8cea055d7cec8f3866e07458"}
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.999356 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b327e217d35c6d93e72eb94929487855351a78ab8cea055d7cec8f3866e07458"
Feb 20 17:02:19 crc kubenswrapper[4697]: I0220 17:02:19.999375 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.114691 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"]
Feb 20 17:02:20 crc kubenswrapper[4697]: E0220 17:02:20.115587 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2756b57-da81-4893-85d4-119fe103b4de" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.115623 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2756b57-da81-4893-85d4-119fe103b4de" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.115959 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2756b57-da81-4893-85d4-119fe103b4de" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.117172 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.123109 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.123325 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.123450 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.123804 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.125537 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"]
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.284555 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdlvd\" (UniqueName: \"kubernetes.io/projected/0ed71fc6-e5c0-40fe-988e-04a30088f620-kube-api-access-jdlvd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.284617 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.284695 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.386920 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdlvd\" (UniqueName: \"kubernetes.io/projected/0ed71fc6-e5c0-40fe-988e-04a30088f620-kube-api-access-jdlvd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.386989 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.387025 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.390777 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.390866 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.405409 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdlvd\" (UniqueName: \"kubernetes.io/projected/0ed71fc6-e5c0-40fe-988e-04a30088f620-kube-api-access-jdlvd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.450365 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:20 crc kubenswrapper[4697]: I0220 17:02:20.995091 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"]
Feb 20 17:02:21 crc kubenswrapper[4697]: I0220 17:02:21.010555 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8" event={"ID":"0ed71fc6-e5c0-40fe-988e-04a30088f620","Type":"ContainerStarted","Data":"acc49566cd8e7bfcec3dab4346b1e82edbb1b6f10442f96820539edf23935075"}
Feb 20 17:02:22 crc kubenswrapper[4697]: I0220 17:02:22.020197 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8" event={"ID":"0ed71fc6-e5c0-40fe-988e-04a30088f620","Type":"ContainerStarted","Data":"fb0a10a87b9aa4243361031b1047d81710360fb92db2be3b373af46b3c515813"}
Feb 20 17:02:22 crc kubenswrapper[4697]: I0220 17:02:22.049052 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8" podStartSLOduration=1.512097292 podStartE2EDuration="2.049031841s" podCreationTimestamp="2026-02-20 17:02:20 +0000 UTC" firstStartedPulling="2026-02-20 17:02:21.003164584 +0000 UTC m=+1848.783209992" lastFinishedPulling="2026-02-20 17:02:21.540099113 +0000 UTC m=+1849.320144541" observedRunningTime="2026-02-20 17:02:22.041910165 +0000 UTC m=+1849.821955573" watchObservedRunningTime="2026-02-20 17:02:22.049031841 +0000 UTC m=+1849.829077249"
Feb 20 17:02:27 crc kubenswrapper[4697]: I0220 17:02:27.068891 4697 generic.go:334] "Generic (PLEG): container finished" podID="0ed71fc6-e5c0-40fe-988e-04a30088f620" containerID="fb0a10a87b9aa4243361031b1047d81710360fb92db2be3b373af46b3c515813" exitCode=0
Feb 20 17:02:27 crc kubenswrapper[4697]: I0220 17:02:27.068996 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8" event={"ID":"0ed71fc6-e5c0-40fe-988e-04a30088f620","Type":"ContainerDied","Data":"fb0a10a87b9aa4243361031b1047d81710360fb92db2be3b373af46b3c515813"}
Feb 20 17:02:28 crc kubenswrapper[4697]: I0220 17:02:28.460255 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:28 crc kubenswrapper[4697]: I0220 17:02:28.481287 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-ssh-key-openstack-edpm-ipam\") pod \"0ed71fc6-e5c0-40fe-988e-04a30088f620\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") "
Feb 20 17:02:28 crc kubenswrapper[4697]: I0220 17:02:28.481505 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdlvd\" (UniqueName: \"kubernetes.io/projected/0ed71fc6-e5c0-40fe-988e-04a30088f620-kube-api-access-jdlvd\") pod \"0ed71fc6-e5c0-40fe-988e-04a30088f620\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") "
Feb 20 17:02:28 crc kubenswrapper[4697]: I0220 17:02:28.481566 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-inventory\") pod \"0ed71fc6-e5c0-40fe-988e-04a30088f620\" (UID: \"0ed71fc6-e5c0-40fe-988e-04a30088f620\") "
Feb 20 17:02:28 crc kubenswrapper[4697]: I0220 17:02:28.487394 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed71fc6-e5c0-40fe-988e-04a30088f620-kube-api-access-jdlvd" (OuterVolumeSpecName: "kube-api-access-jdlvd") pod "0ed71fc6-e5c0-40fe-988e-04a30088f620" (UID: "0ed71fc6-e5c0-40fe-988e-04a30088f620"). InnerVolumeSpecName "kube-api-access-jdlvd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 17:02:28 crc kubenswrapper[4697]: I0220 17:02:28.507249 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0ed71fc6-e5c0-40fe-988e-04a30088f620" (UID: "0ed71fc6-e5c0-40fe-988e-04a30088f620"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 17:02:28 crc kubenswrapper[4697]: I0220 17:02:28.507656 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-inventory" (OuterVolumeSpecName: "inventory") pod "0ed71fc6-e5c0-40fe-988e-04a30088f620" (UID: "0ed71fc6-e5c0-40fe-988e-04a30088f620"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 17:02:28 crc kubenswrapper[4697]: I0220 17:02:28.584110 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdlvd\" (UniqueName: \"kubernetes.io/projected/0ed71fc6-e5c0-40fe-988e-04a30088f620-kube-api-access-jdlvd\") on node \"crc\" DevicePath \"\""
Feb 20 17:02:28 crc kubenswrapper[4697]: I0220 17:02:28.584142 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 17:02:28 crc kubenswrapper[4697]: I0220 17:02:28.584152 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ed71fc6-e5c0-40fe-988e-04a30088f620-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.094158 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8" event={"ID":"0ed71fc6-e5c0-40fe-988e-04a30088f620","Type":"ContainerDied","Data":"acc49566cd8e7bfcec3dab4346b1e82edbb1b6f10442f96820539edf23935075"}
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.094507 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc49566cd8e7bfcec3dab4346b1e82edbb1b6f10442f96820539edf23935075"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.094221 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.178177 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"]
Feb 20 17:02:29 crc kubenswrapper[4697]: E0220 17:02:29.178666 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed71fc6-e5c0-40fe-988e-04a30088f620" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.178686 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed71fc6-e5c0-40fe-988e-04a30088f620" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.178945 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed71fc6-e5c0-40fe-988e-04a30088f620" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.180016 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.184466 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.185141 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.185457 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.193893 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.194574 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"]
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.196389 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6mlzs\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.196493 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9crgg\" (UniqueName: \"kubernetes.io/projected/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-kube-api-access-9crgg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6mlzs\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.196589 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6mlzs\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.299931 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9crgg\" (UniqueName: \"kubernetes.io/projected/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-kube-api-access-9crgg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6mlzs\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.300123 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6mlzs\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.300323 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6mlzs\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"
Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.309481 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName:
\"kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6mlzs\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs" Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.312192 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6mlzs\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs" Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.317469 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9crgg\" (UniqueName: \"kubernetes.io/projected/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-kube-api-access-9crgg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6mlzs\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs" Feb 20 17:02:29 crc kubenswrapper[4697]: I0220 17:02:29.507237 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs" Feb 20 17:02:30 crc kubenswrapper[4697]: I0220 17:02:30.018780 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"] Feb 20 17:02:30 crc kubenswrapper[4697]: I0220 17:02:30.103816 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs" event={"ID":"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9","Type":"ContainerStarted","Data":"75283b0e07da306b6445c780c3b4dc90f6ea7383015a715e7e7ff603a7b839a0"} Feb 20 17:02:31 crc kubenswrapper[4697]: I0220 17:02:31.112802 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs" event={"ID":"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9","Type":"ContainerStarted","Data":"0efa9a1a1c9fb1aba3e01e3c59242c31199ac5194a938bcf17295266e075b997"} Feb 20 17:02:31 crc kubenswrapper[4697]: I0220 17:02:31.143476 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs" podStartSLOduration=1.740343223 podStartE2EDuration="2.14345548s" podCreationTimestamp="2026-02-20 17:02:29 +0000 UTC" firstStartedPulling="2026-02-20 17:02:30.012399462 +0000 UTC m=+1857.792444870" lastFinishedPulling="2026-02-20 17:02:30.415511719 +0000 UTC m=+1858.195557127" observedRunningTime="2026-02-20 17:02:31.132246084 +0000 UTC m=+1858.912291522" watchObservedRunningTime="2026-02-20 17:02:31.14345548 +0000 UTC m=+1858.923500888" Feb 20 17:02:45 crc kubenswrapper[4697]: I0220 17:02:45.053332 4697 scope.go:117] "RemoveContainer" containerID="b4f4b6c7eceae72eaf36f93145369ac145650bbb3d018f5decf6183a22506971" Feb 20 17:02:45 crc kubenswrapper[4697]: I0220 17:02:45.106593 4697 scope.go:117] "RemoveContainer" containerID="5e7053b3a6eaabbc80d3c16eb184fb5761b6d49350a579ad316eebd23ea2b330" Feb 20 17:02:45 crc 
kubenswrapper[4697]: I0220 17:02:45.144245 4697 scope.go:117] "RemoveContainer" containerID="d204f0b69a294f0c73e9d8e10974149f4b0bd36582ae08b87b1ebcae8059eacc" Feb 20 17:02:45 crc kubenswrapper[4697]: I0220 17:02:45.199403 4697 scope.go:117] "RemoveContainer" containerID="27bc6ecc4912db01e3f0bc28e595316ae27904ad3409f54a97d4a9cec5f1a769" Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.493330 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dwg7l"] Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.495903 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.507833 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dwg7l"] Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.517787 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-catalog-content\") pod \"certified-operators-dwg7l\" (UID: \"e38df26d-85b1-4da3-9adc-355d959fb297\") " pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.517884 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-utilities\") pod \"certified-operators-dwg7l\" (UID: \"e38df26d-85b1-4da3-9adc-355d959fb297\") " pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.517913 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pj6k\" (UniqueName: \"kubernetes.io/projected/e38df26d-85b1-4da3-9adc-355d959fb297-kube-api-access-2pj6k\") pod 
\"certified-operators-dwg7l\" (UID: \"e38df26d-85b1-4da3-9adc-355d959fb297\") " pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.619960 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-catalog-content\") pod \"certified-operators-dwg7l\" (UID: \"e38df26d-85b1-4da3-9adc-355d959fb297\") " pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.620075 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-utilities\") pod \"certified-operators-dwg7l\" (UID: \"e38df26d-85b1-4da3-9adc-355d959fb297\") " pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.620101 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pj6k\" (UniqueName: \"kubernetes.io/projected/e38df26d-85b1-4da3-9adc-355d959fb297-kube-api-access-2pj6k\") pod \"certified-operators-dwg7l\" (UID: \"e38df26d-85b1-4da3-9adc-355d959fb297\") " pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.620934 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-catalog-content\") pod \"certified-operators-dwg7l\" (UID: \"e38df26d-85b1-4da3-9adc-355d959fb297\") " pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.620970 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-utilities\") pod \"certified-operators-dwg7l\" (UID: 
\"e38df26d-85b1-4da3-9adc-355d959fb297\") " pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.651213 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pj6k\" (UniqueName: \"kubernetes.io/projected/e38df26d-85b1-4da3-9adc-355d959fb297-kube-api-access-2pj6k\") pod \"certified-operators-dwg7l\" (UID: \"e38df26d-85b1-4da3-9adc-355d959fb297\") " pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:48 crc kubenswrapper[4697]: I0220 17:02:48.816229 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:49 crc kubenswrapper[4697]: I0220 17:02:49.295763 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dwg7l"] Feb 20 17:02:49 crc kubenswrapper[4697]: I0220 17:02:49.320608 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwg7l" event={"ID":"e38df26d-85b1-4da3-9adc-355d959fb297","Type":"ContainerStarted","Data":"a9b46b44960eb9ef1f87e6875f86477c4d8d56f662aeea13eff0008b914d6498"} Feb 20 17:02:50 crc kubenswrapper[4697]: I0220 17:02:50.042381 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-246e-account-create-update-5h8zj"] Feb 20 17:02:50 crc kubenswrapper[4697]: I0220 17:02:50.050142 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-246e-account-create-update-5h8zj"] Feb 20 17:02:50 crc kubenswrapper[4697]: I0220 17:02:50.330797 4697 generic.go:334] "Generic (PLEG): container finished" podID="e38df26d-85b1-4da3-9adc-355d959fb297" containerID="f1dea227fbc569dd7f930e46363a17cbb1daac2b1d2a0e689ff7bfb2f0a1cc0c" exitCode=0 Feb 20 17:02:50 crc kubenswrapper[4697]: I0220 17:02:50.330842 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwg7l" 
event={"ID":"e38df26d-85b1-4da3-9adc-355d959fb297","Type":"ContainerDied","Data":"f1dea227fbc569dd7f930e46363a17cbb1daac2b1d2a0e689ff7bfb2f0a1cc0c"} Feb 20 17:02:50 crc kubenswrapper[4697]: I0220 17:02:50.900347 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfab7d0c-8b89-47a2-a01f-03e8962ccb92" path="/var/lib/kubelet/pods/dfab7d0c-8b89-47a2-a01f-03e8962ccb92/volumes" Feb 20 17:02:50 crc kubenswrapper[4697]: I0220 17:02:50.901020 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7fcjr"] Feb 20 17:02:50 crc kubenswrapper[4697]: I0220 17:02:50.931321 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:02:50 crc kubenswrapper[4697]: I0220 17:02:50.931967 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fcjr"] Feb 20 17:02:50 crc kubenswrapper[4697]: I0220 17:02:50.967846 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5nqm\" (UniqueName: \"kubernetes.io/projected/e26bc255-664e-4b7c-bfd6-682e1feb0b12-kube-api-access-g5nqm\") pod \"redhat-operators-7fcjr\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") " pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:02:50 crc kubenswrapper[4697]: I0220 17:02:50.968470 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-utilities\") pod \"redhat-operators-7fcjr\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") " pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:02:50 crc kubenswrapper[4697]: I0220 17:02:50.968572 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-catalog-content\") pod \"redhat-operators-7fcjr\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") " pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.028391 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fd79-account-create-update-gz4br"] Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.036793 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-j9gxf"] Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.044241 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jhkvr"] Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.053491 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1713-account-create-update-6vhb6"] Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.062043 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jhkvr"] Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.069750 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7pd9r"] Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.070646 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nqm\" (UniqueName: \"kubernetes.io/projected/e26bc255-664e-4b7c-bfd6-682e1feb0b12-kube-api-access-g5nqm\") pod \"redhat-operators-7fcjr\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") " pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.070705 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-utilities\") pod \"redhat-operators-7fcjr\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") " pod="openshift-marketplace/redhat-operators-7fcjr" 
Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.070735 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-catalog-content\") pod \"redhat-operators-7fcjr\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") " pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.071228 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-catalog-content\") pod \"redhat-operators-7fcjr\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") " pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.071839 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-utilities\") pod \"redhat-operators-7fcjr\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") " pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.078908 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-j9gxf"] Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.088491 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1713-account-create-update-6vhb6"] Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.091409 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nqm\" (UniqueName: \"kubernetes.io/projected/e26bc255-664e-4b7c-bfd6-682e1feb0b12-kube-api-access-g5nqm\") pod \"redhat-operators-7fcjr\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") " pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.094029 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-fd79-account-create-update-gz4br"] Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.103830 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7pd9r"] Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.249688 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.348634 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwg7l" event={"ID":"e38df26d-85b1-4da3-9adc-355d959fb297","Type":"ContainerStarted","Data":"968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8"} Feb 20 17:02:51 crc kubenswrapper[4697]: I0220 17:02:51.753570 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fcjr"] Feb 20 17:02:52 crc kubenswrapper[4697]: I0220 17:02:52.357670 4697 generic.go:334] "Generic (PLEG): container finished" podID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" containerID="86a2c81194c2b9087defc4d2d42d10c3805d63cd78f0646511ac12d859103cd4" exitCode=0 Feb 20 17:02:52 crc kubenswrapper[4697]: I0220 17:02:52.357755 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fcjr" event={"ID":"e26bc255-664e-4b7c-bfd6-682e1feb0b12","Type":"ContainerDied","Data":"86a2c81194c2b9087defc4d2d42d10c3805d63cd78f0646511ac12d859103cd4"} Feb 20 17:02:52 crc kubenswrapper[4697]: I0220 17:02:52.358116 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fcjr" event={"ID":"e26bc255-664e-4b7c-bfd6-682e1feb0b12","Type":"ContainerStarted","Data":"9180f78b153b4b1aa6e4937b8993f6d283a4e9b08ff95ea8005765209da9205e"} Feb 20 17:02:52 crc kubenswrapper[4697]: I0220 17:02:52.892609 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f2fd71a-c008-4059-b649-f881e3809691" 
path="/var/lib/kubelet/pods/5f2fd71a-c008-4059-b649-f881e3809691/volumes" Feb 20 17:02:52 crc kubenswrapper[4697]: I0220 17:02:52.893386 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a0cc76-ae5d-4ed2-898b-8e9ade19ada3" path="/var/lib/kubelet/pods/95a0cc76-ae5d-4ed2-898b-8e9ade19ada3/volumes" Feb 20 17:02:52 crc kubenswrapper[4697]: I0220 17:02:52.894130 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97792b78-572a-47ab-9ed8-381adea8950c" path="/var/lib/kubelet/pods/97792b78-572a-47ab-9ed8-381adea8950c/volumes" Feb 20 17:02:52 crc kubenswrapper[4697]: I0220 17:02:52.894849 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa21f75-ab3a-4a94-8350-a891760b38b8" path="/var/lib/kubelet/pods/aaa21f75-ab3a-4a94-8350-a891760b38b8/volumes" Feb 20 17:02:52 crc kubenswrapper[4697]: I0220 17:02:52.896231 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1fc0882-9076-4159-86d9-bf9795206baf" path="/var/lib/kubelet/pods/d1fc0882-9076-4159-86d9-bf9795206baf/volumes" Feb 20 17:02:53 crc kubenswrapper[4697]: I0220 17:02:53.369199 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fcjr" event={"ID":"e26bc255-664e-4b7c-bfd6-682e1feb0b12","Type":"ContainerStarted","Data":"a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20"} Feb 20 17:02:53 crc kubenswrapper[4697]: I0220 17:02:53.372118 4697 generic.go:334] "Generic (PLEG): container finished" podID="e38df26d-85b1-4da3-9adc-355d959fb297" containerID="968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8" exitCode=0 Feb 20 17:02:53 crc kubenswrapper[4697]: I0220 17:02:53.372156 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwg7l" event={"ID":"e38df26d-85b1-4da3-9adc-355d959fb297","Type":"ContainerDied","Data":"968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8"} Feb 20 17:02:54 crc 
kubenswrapper[4697]: I0220 17:02:54.381247 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwg7l" event={"ID":"e38df26d-85b1-4da3-9adc-355d959fb297","Type":"ContainerStarted","Data":"b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb"} Feb 20 17:02:54 crc kubenswrapper[4697]: I0220 17:02:54.408531 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dwg7l" podStartSLOduration=2.80975793 podStartE2EDuration="6.408512055s" podCreationTimestamp="2026-02-20 17:02:48 +0000 UTC" firstStartedPulling="2026-02-20 17:02:50.33248633 +0000 UTC m=+1878.112531738" lastFinishedPulling="2026-02-20 17:02:53.931240455 +0000 UTC m=+1881.711285863" observedRunningTime="2026-02-20 17:02:54.400762248 +0000 UTC m=+1882.180807676" watchObservedRunningTime="2026-02-20 17:02:54.408512055 +0000 UTC m=+1882.188557463" Feb 20 17:02:58 crc kubenswrapper[4697]: I0220 17:02:58.416555 4697 generic.go:334] "Generic (PLEG): container finished" podID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" containerID="a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20" exitCode=0 Feb 20 17:02:58 crc kubenswrapper[4697]: I0220 17:02:58.417105 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fcjr" event={"ID":"e26bc255-664e-4b7c-bfd6-682e1feb0b12","Type":"ContainerDied","Data":"a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20"} Feb 20 17:02:58 crc kubenswrapper[4697]: I0220 17:02:58.816552 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:58 crc kubenswrapper[4697]: I0220 17:02:58.816876 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:58 crc kubenswrapper[4697]: I0220 17:02:58.866664 4697 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:02:59 crc kubenswrapper[4697]: I0220 17:02:59.429802 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fcjr" event={"ID":"e26bc255-664e-4b7c-bfd6-682e1feb0b12","Type":"ContainerStarted","Data":"cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6"} Feb 20 17:02:59 crc kubenswrapper[4697]: I0220 17:02:59.457962 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7fcjr" podStartSLOduration=2.766699445 podStartE2EDuration="9.457944368s" podCreationTimestamp="2026-02-20 17:02:50 +0000 UTC" firstStartedPulling="2026-02-20 17:02:52.359272009 +0000 UTC m=+1880.139317417" lastFinishedPulling="2026-02-20 17:02:59.050516932 +0000 UTC m=+1886.830562340" observedRunningTime="2026-02-20 17:02:59.451063335 +0000 UTC m=+1887.231108743" watchObservedRunningTime="2026-02-20 17:02:59.457944368 +0000 UTC m=+1887.237989776" Feb 20 17:02:59 crc kubenswrapper[4697]: I0220 17:02:59.514592 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:03:01 crc kubenswrapper[4697]: I0220 17:03:01.082150 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dwg7l"] Feb 20 17:03:01 crc kubenswrapper[4697]: I0220 17:03:01.250646 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:03:01 crc kubenswrapper[4697]: I0220 17:03:01.250980 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7fcjr" Feb 20 17:03:02 crc kubenswrapper[4697]: I0220 17:03:02.312689 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7fcjr" podUID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" 
containerName="registry-server" probeResult="failure" output=< Feb 20 17:03:02 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 17:03:02 crc kubenswrapper[4697]: > Feb 20 17:03:02 crc kubenswrapper[4697]: I0220 17:03:02.453295 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dwg7l" podUID="e38df26d-85b1-4da3-9adc-355d959fb297" containerName="registry-server" containerID="cri-o://b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb" gracePeriod=2 Feb 20 17:03:02 crc kubenswrapper[4697]: I0220 17:03:02.926652 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dwg7l" Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.010635 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-utilities\") pod \"e38df26d-85b1-4da3-9adc-355d959fb297\" (UID: \"e38df26d-85b1-4da3-9adc-355d959fb297\") " Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.010765 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pj6k\" (UniqueName: \"kubernetes.io/projected/e38df26d-85b1-4da3-9adc-355d959fb297-kube-api-access-2pj6k\") pod \"e38df26d-85b1-4da3-9adc-355d959fb297\" (UID: \"e38df26d-85b1-4da3-9adc-355d959fb297\") " Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.010814 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-catalog-content\") pod \"e38df26d-85b1-4da3-9adc-355d959fb297\" (UID: \"e38df26d-85b1-4da3-9adc-355d959fb297\") " Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.011630 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-utilities" (OuterVolumeSpecName: "utilities") pod "e38df26d-85b1-4da3-9adc-355d959fb297" (UID: "e38df26d-85b1-4da3-9adc-355d959fb297"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.020649 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e38df26d-85b1-4da3-9adc-355d959fb297-kube-api-access-2pj6k" (OuterVolumeSpecName: "kube-api-access-2pj6k") pod "e38df26d-85b1-4da3-9adc-355d959fb297" (UID: "e38df26d-85b1-4da3-9adc-355d959fb297"). InnerVolumeSpecName "kube-api-access-2pj6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.070444 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e38df26d-85b1-4da3-9adc-355d959fb297" (UID: "e38df26d-85b1-4da3-9adc-355d959fb297"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.114216 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.114258 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pj6k\" (UniqueName: \"kubernetes.io/projected/e38df26d-85b1-4da3-9adc-355d959fb297-kube-api-access-2pj6k\") on node \"crc\" DevicePath \"\""
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.114269 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e38df26d-85b1-4da3-9adc-355d959fb297-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.466136 4697 generic.go:334] "Generic (PLEG): container finished" podID="e38df26d-85b1-4da3-9adc-355d959fb297" containerID="b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb" exitCode=0
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.466179 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwg7l" event={"ID":"e38df26d-85b1-4da3-9adc-355d959fb297","Type":"ContainerDied","Data":"b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb"}
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.466204 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwg7l" event={"ID":"e38df26d-85b1-4da3-9adc-355d959fb297","Type":"ContainerDied","Data":"a9b46b44960eb9ef1f87e6875f86477c4d8d56f662aeea13eff0008b914d6498"}
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.466220 4697 scope.go:117] "RemoveContainer" containerID="b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb"
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.466341 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dwg7l"
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.494409 4697 scope.go:117] "RemoveContainer" containerID="968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8"
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.506246 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dwg7l"]
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.521040 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dwg7l"]
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.524597 4697 scope.go:117] "RemoveContainer" containerID="f1dea227fbc569dd7f930e46363a17cbb1daac2b1d2a0e689ff7bfb2f0a1cc0c"
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.578523 4697 scope.go:117] "RemoveContainer" containerID="b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb"
Feb 20 17:03:03 crc kubenswrapper[4697]: E0220 17:03:03.578998 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb\": container with ID starting with b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb not found: ID does not exist" containerID="b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb"
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.579061 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb"} err="failed to get container status \"b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb\": rpc error: code = NotFound desc = could not find container \"b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb\": container with ID starting with b29e662af2e3d2594f7acc0ff8ad3224640bb9723ea3bdfa541728c23cab04fb not found: ID does not exist"
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.579104 4697 scope.go:117] "RemoveContainer" containerID="968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8"
Feb 20 17:03:03 crc kubenswrapper[4697]: E0220 17:03:03.579942 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8\": container with ID starting with 968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8 not found: ID does not exist" containerID="968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8"
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.580113 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8"} err="failed to get container status \"968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8\": rpc error: code = NotFound desc = could not find container \"968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8\": container with ID starting with 968037da77b0d20716debd8f0c3c23e5561f4572497f7f37dda20ad0b750b3e8 not found: ID does not exist"
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.580261 4697 scope.go:117] "RemoveContainer" containerID="f1dea227fbc569dd7f930e46363a17cbb1daac2b1d2a0e689ff7bfb2f0a1cc0c"
Feb 20 17:03:03 crc kubenswrapper[4697]: E0220 17:03:03.580719 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1dea227fbc569dd7f930e46363a17cbb1daac2b1d2a0e689ff7bfb2f0a1cc0c\": container with ID starting with f1dea227fbc569dd7f930e46363a17cbb1daac2b1d2a0e689ff7bfb2f0a1cc0c not found: ID does not exist" containerID="f1dea227fbc569dd7f930e46363a17cbb1daac2b1d2a0e689ff7bfb2f0a1cc0c"
Feb 20 17:03:03 crc kubenswrapper[4697]: I0220 17:03:03.580767 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1dea227fbc569dd7f930e46363a17cbb1daac2b1d2a0e689ff7bfb2f0a1cc0c"} err="failed to get container status \"f1dea227fbc569dd7f930e46363a17cbb1daac2b1d2a0e689ff7bfb2f0a1cc0c\": rpc error: code = NotFound desc = could not find container \"f1dea227fbc569dd7f930e46363a17cbb1daac2b1d2a0e689ff7bfb2f0a1cc0c\": container with ID starting with f1dea227fbc569dd7f930e46363a17cbb1daac2b1d2a0e689ff7bfb2f0a1cc0c not found: ID does not exist"
Feb 20 17:03:04 crc kubenswrapper[4697]: I0220 17:03:04.886904 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e38df26d-85b1-4da3-9adc-355d959fb297" path="/var/lib/kubelet/pods/e38df26d-85b1-4da3-9adc-355d959fb297/volumes"
Feb 20 17:03:07 crc kubenswrapper[4697]: E0220 17:03:07.707541 4697 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0272cd28_d20c_4ffb_9dc0_dbaa2d92aae9.slice/crio-conmon-0efa9a1a1c9fb1aba3e01e3c59242c31199ac5194a938bcf17295266e075b997.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0272cd28_d20c_4ffb_9dc0_dbaa2d92aae9.slice/crio-0efa9a1a1c9fb1aba3e01e3c59242c31199ac5194a938bcf17295266e075b997.scope\": RecentStats: unable to find data in memory cache]"
Feb 20 17:03:08 crc kubenswrapper[4697]: I0220 17:03:08.525606 4697 generic.go:334] "Generic (PLEG): container finished" podID="0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9" containerID="0efa9a1a1c9fb1aba3e01e3c59242c31199ac5194a938bcf17295266e075b997" exitCode=0
Feb 20 17:03:08 crc kubenswrapper[4697]: I0220 17:03:08.525633 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"
event={"ID":"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9","Type":"ContainerDied","Data":"0efa9a1a1c9fb1aba3e01e3c59242c31199ac5194a938bcf17295266e075b997"}
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.000248 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.048468 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9crgg\" (UniqueName: \"kubernetes.io/projected/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-kube-api-access-9crgg\") pod \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") "
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.048531 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-ssh-key-openstack-edpm-ipam\") pod \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") "
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.048639 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-inventory\") pod \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\" (UID: \"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9\") "
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.059884 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-kube-api-access-9crgg" (OuterVolumeSpecName: "kube-api-access-9crgg") pod "0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9" (UID: "0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9"). InnerVolumeSpecName "kube-api-access-9crgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.090882 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9" (UID: "0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.093567 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-inventory" (OuterVolumeSpecName: "inventory") pod "0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9" (UID: "0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.151496 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.151984 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9crgg\" (UniqueName: \"kubernetes.io/projected/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-kube-api-access-9crgg\") on node \"crc\" DevicePath \"\""
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.151996 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.546536 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs" event={"ID":"0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9","Type":"ContainerDied","Data":"75283b0e07da306b6445c780c3b4dc90f6ea7383015a715e7e7ff603a7b839a0"}
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.546583 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75283b0e07da306b6445c780c3b4dc90f6ea7383015a715e7e7ff603a7b839a0"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.546967 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6mlzs"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.648059 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"]
Feb 20 17:03:10 crc kubenswrapper[4697]: E0220 17:03:10.648459 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38df26d-85b1-4da3-9adc-355d959fb297" containerName="extract-utilities"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.648480 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38df26d-85b1-4da3-9adc-355d959fb297" containerName="extract-utilities"
Feb 20 17:03:10 crc kubenswrapper[4697]: E0220 17:03:10.648503 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38df26d-85b1-4da3-9adc-355d959fb297" containerName="registry-server"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.648509 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38df26d-85b1-4da3-9adc-355d959fb297" containerName="registry-server"
Feb 20 17:03:10 crc kubenswrapper[4697]: E0220 17:03:10.648524 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38df26d-85b1-4da3-9adc-355d959fb297" containerName="extract-content"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.648531 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38df26d-85b1-4da3-9adc-355d959fb297" containerName="extract-content"
Feb 20 17:03:10 crc kubenswrapper[4697]: E0220 17:03:10.648554 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.648562 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.648736 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.648746 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e38df26d-85b1-4da3-9adc-355d959fb297" containerName="registry-server"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.649500 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.652199 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.652568 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.652738 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.652907 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.660968 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"]
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.763398 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.763526 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46phf\" (UniqueName: \"kubernetes.io/projected/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-kube-api-access-46phf\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.763612 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.865216 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.865372 4697 reconciler_common.go:218] "operationExecutor.MountVolume started
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.865570 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46phf\" (UniqueName: \"kubernetes.io/projected/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-kube-api-access-46phf\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.869817 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.870488 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:03:10 crc kubenswrapper[4697]: I0220 17:03:10.904842 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46phf\" (UniqueName: \"kubernetes.io/projected/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-kube-api-access-46phf\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:03:11 crc kubenswrapper[4697]: I0220 17:03:11.013329 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:03:11 crc kubenswrapper[4697]: I0220 17:03:11.297481 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7fcjr"
Feb 20 17:03:11 crc kubenswrapper[4697]: I0220 17:03:11.350580 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7fcjr"
Feb 20 17:03:11 crc kubenswrapper[4697]: I0220 17:03:11.520924 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"]
Feb 20 17:03:11 crc kubenswrapper[4697]: I0220 17:03:11.530242 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7fcjr"]
Feb 20 17:03:11 crc kubenswrapper[4697]: I0220 17:03:11.557473 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f" event={"ID":"904a709b-1b1b-46d9-b2cd-d6517ff7ef07","Type":"ContainerStarted","Data":"817597ece15129b724dd21a8ae8ad08f4167c441bbb68ad38b1e7e10eeddede4"}
Feb 20 17:03:12 crc kubenswrapper[4697]: I0220 17:03:12.569060 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f" event={"ID":"904a709b-1b1b-46d9-b2cd-d6517ff7ef07","Type":"ContainerStarted","Data":"02f27c457c1e19fe575d62c46e8a738cb0914c68447563d4c285db5a83539285"}
Feb 20 17:03:12 crc kubenswrapper[4697]: I0220 17:03:12.569212 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7fcjr" podUID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" containerName="registry-server" containerID="cri-o://cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6" gracePeriod=2
Feb 20 17:03:12 crc kubenswrapper[4697]: I0220 17:03:12.585941 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f" podStartSLOduration=2.17931989 podStartE2EDuration="2.585919775s" podCreationTimestamp="2026-02-20 17:03:10 +0000 UTC" firstStartedPulling="2026-02-20 17:03:11.527115686 +0000 UTC m=+1899.307161094" lastFinishedPulling="2026-02-20 17:03:11.933715571 +0000 UTC m=+1899.713760979" observedRunningTime="2026-02-20 17:03:12.584090196 +0000 UTC m=+1900.364135614" watchObservedRunningTime="2026-02-20 17:03:12.585919775 +0000 UTC m=+1900.365965183"
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.043833 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fcjr"
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.111020 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-utilities\") pod \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") "
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.111144 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-catalog-content\") pod \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") "
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.111174 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5nqm\" (UniqueName: \"kubernetes.io/projected/e26bc255-664e-4b7c-bfd6-682e1feb0b12-kube-api-access-g5nqm\") pod \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\" (UID: \"e26bc255-664e-4b7c-bfd6-682e1feb0b12\") "
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.113090 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-utilities" (OuterVolumeSpecName: "utilities") pod "e26bc255-664e-4b7c-bfd6-682e1feb0b12" (UID: "e26bc255-664e-4b7c-bfd6-682e1feb0b12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.117934 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26bc255-664e-4b7c-bfd6-682e1feb0b12-kube-api-access-g5nqm" (OuterVolumeSpecName: "kube-api-access-g5nqm") pod "e26bc255-664e-4b7c-bfd6-682e1feb0b12" (UID: "e26bc255-664e-4b7c-bfd6-682e1feb0b12"). InnerVolumeSpecName "kube-api-access-g5nqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.212992 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.213026 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5nqm\" (UniqueName: \"kubernetes.io/projected/e26bc255-664e-4b7c-bfd6-682e1feb0b12-kube-api-access-g5nqm\") on node \"crc\" DevicePath \"\""
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.249893 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e26bc255-664e-4b7c-bfd6-682e1feb0b12" (UID: "e26bc255-664e-4b7c-bfd6-682e1feb0b12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.315721 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26bc255-664e-4b7c-bfd6-682e1feb0b12-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.579258 4697 generic.go:334] "Generic (PLEG): container finished" podID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" containerID="cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6" exitCode=0
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.579290 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fcjr"
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.579311 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fcjr" event={"ID":"e26bc255-664e-4b7c-bfd6-682e1feb0b12","Type":"ContainerDied","Data":"cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6"}
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.580388 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fcjr" event={"ID":"e26bc255-664e-4b7c-bfd6-682e1feb0b12","Type":"ContainerDied","Data":"9180f78b153b4b1aa6e4937b8993f6d283a4e9b08ff95ea8005765209da9205e"}
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.580414 4697 scope.go:117] "RemoveContainer" containerID="cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6"
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.618082 4697 scope.go:117] "RemoveContainer" containerID="a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20"
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.619158 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7fcjr"]
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220
17:03:13.645385 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7fcjr"]
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.658473 4697 scope.go:117] "RemoveContainer" containerID="86a2c81194c2b9087defc4d2d42d10c3805d63cd78f0646511ac12d859103cd4"
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.696084 4697 scope.go:117] "RemoveContainer" containerID="cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6"
Feb 20 17:03:13 crc kubenswrapper[4697]: E0220 17:03:13.696510 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6\": container with ID starting with cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6 not found: ID does not exist" containerID="cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6"
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.696542 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6"} err="failed to get container status \"cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6\": rpc error: code = NotFound desc = could not find container \"cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6\": container with ID starting with cf62b80ff3783db1125c81cce21348a390a95b5aa85b77518fcc55813ab5fed6 not found: ID does not exist"
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.696562 4697 scope.go:117] "RemoveContainer" containerID="a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20"
Feb 20 17:03:13 crc kubenswrapper[4697]: E0220 17:03:13.696901 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20\": container with ID starting with a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20 not found: ID does not exist" containerID="a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20"
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.696924 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20"} err="failed to get container status \"a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20\": rpc error: code = NotFound desc = could not find container \"a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20\": container with ID starting with a689ec6cf946569c8cba6035d151dd36c98919b9a5395047f9c8cff690de1d20 not found: ID does not exist"
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.696939 4697 scope.go:117] "RemoveContainer" containerID="86a2c81194c2b9087defc4d2d42d10c3805d63cd78f0646511ac12d859103cd4"
Feb 20 17:03:13 crc kubenswrapper[4697]: E0220 17:03:13.697175 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86a2c81194c2b9087defc4d2d42d10c3805d63cd78f0646511ac12d859103cd4\": container with ID starting with 86a2c81194c2b9087defc4d2d42d10c3805d63cd78f0646511ac12d859103cd4 not found: ID does not exist" containerID="86a2c81194c2b9087defc4d2d42d10c3805d63cd78f0646511ac12d859103cd4"
Feb 20 17:03:13 crc kubenswrapper[4697]: I0220 17:03:13.697199 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86a2c81194c2b9087defc4d2d42d10c3805d63cd78f0646511ac12d859103cd4"} err="failed to get container status \"86a2c81194c2b9087defc4d2d42d10c3805d63cd78f0646511ac12d859103cd4\": rpc error: code = NotFound desc = could not find container \"86a2c81194c2b9087defc4d2d42d10c3805d63cd78f0646511ac12d859103cd4\": container with ID starting with 86a2c81194c2b9087defc4d2d42d10c3805d63cd78f0646511ac12d859103cd4 not found: ID does not exist"
Feb 20 17:03:14 crc kubenswrapper[4697]: I0220 17:03:14.890605 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" path="/var/lib/kubelet/pods/e26bc255-664e-4b7c-bfd6-682e1feb0b12/volumes"
Feb 20 17:03:26 crc kubenswrapper[4697]: I0220 17:03:26.053156 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4l2dn"]
Feb 20 17:03:26 crc kubenswrapper[4697]: I0220 17:03:26.064551 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4l2dn"]
Feb 20 17:03:26 crc kubenswrapper[4697]: I0220 17:03:26.896147 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25b9be1a-ce8d-40bb-978c-f2213d6175d2" path="/var/lib/kubelet/pods/25b9be1a-ce8d-40bb-978c-f2213d6175d2/volumes"
Feb 20 17:03:45 crc kubenswrapper[4697]: I0220 17:03:45.321385 4697 scope.go:117] "RemoveContainer" containerID="a7e9786095ab3c9d0b955842128535d2ce274ccfb0d497ab87627ef4fad4a57c"
Feb 20 17:03:45 crc kubenswrapper[4697]: I0220 17:03:45.369677 4697 scope.go:117] "RemoveContainer" containerID="f6e7c4cdd439befc501312edc8b2fc4d4d32159105ca6e34d18a6c535f993564"
Feb 20 17:03:45 crc kubenswrapper[4697]: I0220 17:03:45.395425 4697 scope.go:117] "RemoveContainer" containerID="75842581c54de9b4446ae5e3112d2449a92a69c03dc956adf7274152668901a2"
Feb 20 17:03:45 crc kubenswrapper[4697]: I0220 17:03:45.440826 4697 scope.go:117] "RemoveContainer" containerID="7b17efc22c4ec86c5200d4bffbdadf08889f0fec0d312fc6ff2a65b8df4b5f9a"
Feb 20 17:03:45 crc kubenswrapper[4697]: I0220 17:03:45.484968 4697 scope.go:117] "RemoveContainer" containerID="576e2ecdc40c7372f87b52db3659801e13d4d186023b1c5c2867ddd37ec318be"
Feb 20 17:03:45 crc kubenswrapper[4697]: I0220 17:03:45.539317 4697 scope.go:117] "RemoveContainer" containerID="1c9a012e3246ea27666bcbac62224f7dab59619a5367de1a8740b455272625e0"
Feb 20 17:03:45 crc kubenswrapper[4697]: I0220 17:03:45.570167 4697 scope.go:117] "RemoveContainer" containerID="fa19488dcacf05279d16a1a14ed686fdea5626f3196354026d783ddadf15c472"
Feb 20 17:03:50 crc kubenswrapper[4697]: I0220 17:03:50.045924 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-nmwj6"]
Feb 20 17:03:50 crc kubenswrapper[4697]: I0220 17:03:50.054749 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-nmwj6"]
Feb 20 17:03:50 crc kubenswrapper[4697]: I0220 17:03:50.887216 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eebb045-e1b7-4b9d-9d41-9485997d1743" path="/var/lib/kubelet/pods/7eebb045-e1b7-4b9d-9d41-9485997d1743/volumes"
Feb 20 17:03:55 crc kubenswrapper[4697]: I0220 17:03:55.041720 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6bjzj"]
Feb 20 17:03:55 crc kubenswrapper[4697]: I0220 17:03:55.063201 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6bjzj"]
Feb 20 17:03:56 crc kubenswrapper[4697]: I0220 17:03:56.897818 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33dd1dad-ee40-4520-9d04-56a1b69894a0" path="/var/lib/kubelet/pods/33dd1dad-ee40-4520-9d04-56a1b69894a0/volumes"
Feb 20 17:04:01 crc kubenswrapper[4697]: I0220 17:04:01.185417 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 17:04:01 crc kubenswrapper[4697]: I0220 17:04:01.187050 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 17:04:04 crc kubenswrapper[4697]: I0220 17:04:04.207153 4697 generic.go:334] "Generic (PLEG): container finished" podID="904a709b-1b1b-46d9-b2cd-d6517ff7ef07" containerID="02f27c457c1e19fe575d62c46e8a738cb0914c68447563d4c285db5a83539285" exitCode=0
Feb 20 17:04:04 crc kubenswrapper[4697]: I0220 17:04:04.207226 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f" event={"ID":"904a709b-1b1b-46d9-b2cd-d6517ff7ef07","Type":"ContainerDied","Data":"02f27c457c1e19fe575d62c46e8a738cb0914c68447563d4c285db5a83539285"}
Feb 20 17:04:05 crc kubenswrapper[4697]: I0220 17:04:05.661545 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f"
Feb 20 17:04:05 crc kubenswrapper[4697]: I0220 17:04:05.790822 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46phf\" (UniqueName: \"kubernetes.io/projected/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-kube-api-access-46phf\") pod \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") "
Feb 20 17:04:05 crc kubenswrapper[4697]: I0220 17:04:05.790911 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-ssh-key-openstack-edpm-ipam\") pod \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") "
Feb 20 17:04:05 crc kubenswrapper[4697]: I0220 17:04:05.791116 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-inventory\") pod \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\" (UID: \"904a709b-1b1b-46d9-b2cd-d6517ff7ef07\") "
Feb 20 17:04:05 crc kubenswrapper[4697]: I0220 17:04:05.798307 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-kube-api-access-46phf" (OuterVolumeSpecName: "kube-api-access-46phf") pod "904a709b-1b1b-46d9-b2cd-d6517ff7ef07" (UID: "904a709b-1b1b-46d9-b2cd-d6517ff7ef07"). InnerVolumeSpecName "kube-api-access-46phf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 17:04:05 crc kubenswrapper[4697]: I0220 17:04:05.818617 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-inventory" (OuterVolumeSpecName: "inventory") pod "904a709b-1b1b-46d9-b2cd-d6517ff7ef07" (UID: "904a709b-1b1b-46d9-b2cd-d6517ff7ef07"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 17:04:05 crc kubenswrapper[4697]: I0220 17:04:05.858526 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "904a709b-1b1b-46d9-b2cd-d6517ff7ef07" (UID: "904a709b-1b1b-46d9-b2cd-d6517ff7ef07"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:04:05 crc kubenswrapper[4697]: I0220 17:04:05.893520 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46phf\" (UniqueName: \"kubernetes.io/projected/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-kube-api-access-46phf\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:05 crc kubenswrapper[4697]: I0220 17:04:05.893549 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:05 crc kubenswrapper[4697]: I0220 17:04:05.893563 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/904a709b-1b1b-46d9-b2cd-d6517ff7ef07-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.232790 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f" event={"ID":"904a709b-1b1b-46d9-b2cd-d6517ff7ef07","Type":"ContainerDied","Data":"817597ece15129b724dd21a8ae8ad08f4167c441bbb68ad38b1e7e10eeddede4"} Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.233156 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="817597ece15129b724dd21a8ae8ad08f4167c441bbb68ad38b1e7e10eeddede4" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.232853 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.336755 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-57pvl"] Feb 20 17:04:06 crc kubenswrapper[4697]: E0220 17:04:06.337211 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" containerName="registry-server" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.337232 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" containerName="registry-server" Feb 20 17:04:06 crc kubenswrapper[4697]: E0220 17:04:06.337267 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" containerName="extract-utilities" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.337278 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" containerName="extract-utilities" Feb 20 17:04:06 crc kubenswrapper[4697]: E0220 17:04:06.337306 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="904a709b-1b1b-46d9-b2cd-d6517ff7ef07" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.337315 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="904a709b-1b1b-46d9-b2cd-d6517ff7ef07" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 20 17:04:06 crc kubenswrapper[4697]: E0220 17:04:06.337327 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" containerName="extract-content" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.337334 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" containerName="extract-content" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.337678 4697 
memory_manager.go:354] "RemoveStaleState removing state" podUID="904a709b-1b1b-46d9-b2cd-d6517ff7ef07" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.337713 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26bc255-664e-4b7c-bfd6-682e1feb0b12" containerName="registry-server" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.338516 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.344083 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.344115 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.347114 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.347589 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.357229 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-57pvl"] Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.403109 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-57pvl\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.403215 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-57pvl\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.403247 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg2rd\" (UniqueName: \"kubernetes.io/projected/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-kube-api-access-gg2rd\") pod \"ssh-known-hosts-edpm-deployment-57pvl\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.504150 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-57pvl\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.504996 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-57pvl\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.505030 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg2rd\" (UniqueName: \"kubernetes.io/projected/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-kube-api-access-gg2rd\") pod \"ssh-known-hosts-edpm-deployment-57pvl\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.510059 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-57pvl\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.512049 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-57pvl\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.523564 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg2rd\" (UniqueName: \"kubernetes.io/projected/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-kube-api-access-gg2rd\") pod \"ssh-known-hosts-edpm-deployment-57pvl\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:06 crc kubenswrapper[4697]: I0220 17:04:06.656517 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:07 crc kubenswrapper[4697]: I0220 17:04:07.196599 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-57pvl"] Feb 20 17:04:07 crc kubenswrapper[4697]: I0220 17:04:07.241591 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" event={"ID":"ab058f87-f768-4d3a-b7cf-a39feef5c5f6","Type":"ContainerStarted","Data":"1819885be891e6a1dfc83bd444d2aa38abe21e3b70c5348c60d09683225dba7b"} Feb 20 17:04:08 crc kubenswrapper[4697]: I0220 17:04:08.250962 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" event={"ID":"ab058f87-f768-4d3a-b7cf-a39feef5c5f6","Type":"ContainerStarted","Data":"0434b7219406d234d2d5cf87fcb806f7620a0cd111893dea53a210a01bf1f0d7"} Feb 20 17:04:08 crc kubenswrapper[4697]: I0220 17:04:08.275800 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" podStartSLOduration=1.833781063 podStartE2EDuration="2.275774812s" podCreationTimestamp="2026-02-20 17:04:06 +0000 UTC" firstStartedPulling="2026-02-20 17:04:07.204332335 +0000 UTC m=+1954.984377733" lastFinishedPulling="2026-02-20 17:04:07.646326084 +0000 UTC m=+1955.426371482" observedRunningTime="2026-02-20 17:04:08.270516282 +0000 UTC m=+1956.050561720" watchObservedRunningTime="2026-02-20 17:04:08.275774812 +0000 UTC m=+1956.055820220" Feb 20 17:04:15 crc kubenswrapper[4697]: I0220 17:04:15.312388 4697 generic.go:334] "Generic (PLEG): container finished" podID="ab058f87-f768-4d3a-b7cf-a39feef5c5f6" containerID="0434b7219406d234d2d5cf87fcb806f7620a0cd111893dea53a210a01bf1f0d7" exitCode=0 Feb 20 17:04:15 crc kubenswrapper[4697]: I0220 17:04:15.313202 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" 
event={"ID":"ab058f87-f768-4d3a-b7cf-a39feef5c5f6","Type":"ContainerDied","Data":"0434b7219406d234d2d5cf87fcb806f7620a0cd111893dea53a210a01bf1f0d7"} Feb 20 17:04:16 crc kubenswrapper[4697]: I0220 17:04:16.829679 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.008648 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-ssh-key-openstack-edpm-ipam\") pod \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.008800 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg2rd\" (UniqueName: \"kubernetes.io/projected/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-kube-api-access-gg2rd\") pod \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.008849 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-inventory-0\") pod \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\" (UID: \"ab058f87-f768-4d3a-b7cf-a39feef5c5f6\") " Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.022348 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-kube-api-access-gg2rd" (OuterVolumeSpecName: "kube-api-access-gg2rd") pod "ab058f87-f768-4d3a-b7cf-a39feef5c5f6" (UID: "ab058f87-f768-4d3a-b7cf-a39feef5c5f6"). InnerVolumeSpecName "kube-api-access-gg2rd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.036982 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ab058f87-f768-4d3a-b7cf-a39feef5c5f6" (UID: "ab058f87-f768-4d3a-b7cf-a39feef5c5f6"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.046204 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ab058f87-f768-4d3a-b7cf-a39feef5c5f6" (UID: "ab058f87-f768-4d3a-b7cf-a39feef5c5f6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.110786 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.111135 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg2rd\" (UniqueName: \"kubernetes.io/projected/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-kube-api-access-gg2rd\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.111151 4697 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ab058f87-f768-4d3a-b7cf-a39feef5c5f6-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.335182 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" 
event={"ID":"ab058f87-f768-4d3a-b7cf-a39feef5c5f6","Type":"ContainerDied","Data":"1819885be891e6a1dfc83bd444d2aa38abe21e3b70c5348c60d09683225dba7b"} Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.335252 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1819885be891e6a1dfc83bd444d2aa38abe21e3b70c5348c60d09683225dba7b" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.335284 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-57pvl" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.415838 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6"] Feb 20 17:04:17 crc kubenswrapper[4697]: E0220 17:04:17.421837 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab058f87-f768-4d3a-b7cf-a39feef5c5f6" containerName="ssh-known-hosts-edpm-deployment" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.421875 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab058f87-f768-4d3a-b7cf-a39feef5c5f6" containerName="ssh-known-hosts-edpm-deployment" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.422483 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab058f87-f768-4d3a-b7cf-a39feef5c5f6" containerName="ssh-known-hosts-edpm-deployment" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.423854 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.427220 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.427774 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.431422 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.432760 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.442199 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6"] Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.627296 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7tfq6\" (UID: \"0700858d-9b11-4cca-a80c-143da84eea6e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.627372 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv9gm\" (UniqueName: \"kubernetes.io/projected/0700858d-9b11-4cca-a80c-143da84eea6e-kube-api-access-jv9gm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7tfq6\" (UID: \"0700858d-9b11-4cca-a80c-143da84eea6e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.627723 4697 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7tfq6\" (UID: \"0700858d-9b11-4cca-a80c-143da84eea6e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.730352 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7tfq6\" (UID: \"0700858d-9b11-4cca-a80c-143da84eea6e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.730422 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv9gm\" (UniqueName: \"kubernetes.io/projected/0700858d-9b11-4cca-a80c-143da84eea6e-kube-api-access-jv9gm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7tfq6\" (UID: \"0700858d-9b11-4cca-a80c-143da84eea6e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.730493 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7tfq6\" (UID: \"0700858d-9b11-4cca-a80c-143da84eea6e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.734990 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7tfq6\" (UID: 
\"0700858d-9b11-4cca-a80c-143da84eea6e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.737509 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7tfq6\" (UID: \"0700858d-9b11-4cca-a80c-143da84eea6e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.753061 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv9gm\" (UniqueName: \"kubernetes.io/projected/0700858d-9b11-4cca-a80c-143da84eea6e-kube-api-access-jv9gm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7tfq6\" (UID: \"0700858d-9b11-4cca-a80c-143da84eea6e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:17 crc kubenswrapper[4697]: I0220 17:04:17.755077 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:18 crc kubenswrapper[4697]: I0220 17:04:18.263243 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6"] Feb 20 17:04:18 crc kubenswrapper[4697]: I0220 17:04:18.343962 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" event={"ID":"0700858d-9b11-4cca-a80c-143da84eea6e","Type":"ContainerStarted","Data":"17a6ba77e2af2e8cd137f29d988e99a3ce7abc6a7b9e74b4b50a5eb3399fdd81"} Feb 20 17:04:19 crc kubenswrapper[4697]: I0220 17:04:19.352736 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" event={"ID":"0700858d-9b11-4cca-a80c-143da84eea6e","Type":"ContainerStarted","Data":"d50c1211cdaf735c76e5c3adc9279629202065ed7195124fdda0b80f38d5f804"} Feb 20 17:04:19 crc kubenswrapper[4697]: I0220 17:04:19.371035 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" podStartSLOduration=1.890629381 podStartE2EDuration="2.371013424s" podCreationTimestamp="2026-02-20 17:04:17 +0000 UTC" firstStartedPulling="2026-02-20 17:04:18.271417456 +0000 UTC m=+1966.051462874" lastFinishedPulling="2026-02-20 17:04:18.751801509 +0000 UTC m=+1966.531846917" observedRunningTime="2026-02-20 17:04:19.36561782 +0000 UTC m=+1967.145663228" watchObservedRunningTime="2026-02-20 17:04:19.371013424 +0000 UTC m=+1967.151058832" Feb 20 17:04:27 crc kubenswrapper[4697]: I0220 17:04:27.435313 4697 generic.go:334] "Generic (PLEG): container finished" podID="0700858d-9b11-4cca-a80c-143da84eea6e" containerID="d50c1211cdaf735c76e5c3adc9279629202065ed7195124fdda0b80f38d5f804" exitCode=0 Feb 20 17:04:27 crc kubenswrapper[4697]: I0220 17:04:27.435531 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" event={"ID":"0700858d-9b11-4cca-a80c-143da84eea6e","Type":"ContainerDied","Data":"d50c1211cdaf735c76e5c3adc9279629202065ed7195124fdda0b80f38d5f804"} Feb 20 17:04:28 crc kubenswrapper[4697]: I0220 17:04:28.988367 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.152369 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-ssh-key-openstack-edpm-ipam\") pod \"0700858d-9b11-4cca-a80c-143da84eea6e\" (UID: \"0700858d-9b11-4cca-a80c-143da84eea6e\") " Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.152413 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-inventory\") pod \"0700858d-9b11-4cca-a80c-143da84eea6e\" (UID: \"0700858d-9b11-4cca-a80c-143da84eea6e\") " Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.152513 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv9gm\" (UniqueName: \"kubernetes.io/projected/0700858d-9b11-4cca-a80c-143da84eea6e-kube-api-access-jv9gm\") pod \"0700858d-9b11-4cca-a80c-143da84eea6e\" (UID: \"0700858d-9b11-4cca-a80c-143da84eea6e\") " Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.158614 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0700858d-9b11-4cca-a80c-143da84eea6e-kube-api-access-jv9gm" (OuterVolumeSpecName: "kube-api-access-jv9gm") pod "0700858d-9b11-4cca-a80c-143da84eea6e" (UID: "0700858d-9b11-4cca-a80c-143da84eea6e"). InnerVolumeSpecName "kube-api-access-jv9gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.181560 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0700858d-9b11-4cca-a80c-143da84eea6e" (UID: "0700858d-9b11-4cca-a80c-143da84eea6e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.182006 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-inventory" (OuterVolumeSpecName: "inventory") pod "0700858d-9b11-4cca-a80c-143da84eea6e" (UID: "0700858d-9b11-4cca-a80c-143da84eea6e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.254582 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.254617 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0700858d-9b11-4cca-a80c-143da84eea6e-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.254630 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv9gm\" (UniqueName: \"kubernetes.io/projected/0700858d-9b11-4cca-a80c-143da84eea6e-kube-api-access-jv9gm\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.454393 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" 
event={"ID":"0700858d-9b11-4cca-a80c-143da84eea6e","Type":"ContainerDied","Data":"17a6ba77e2af2e8cd137f29d988e99a3ce7abc6a7b9e74b4b50a5eb3399fdd81"} Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.454466 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17a6ba77e2af2e8cd137f29d988e99a3ce7abc6a7b9e74b4b50a5eb3399fdd81" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.454700 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7tfq6" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.632911 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk"] Feb 20 17:04:29 crc kubenswrapper[4697]: E0220 17:04:29.633641 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0700858d-9b11-4cca-a80c-143da84eea6e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.633660 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="0700858d-9b11-4cca-a80c-143da84eea6e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.633916 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="0700858d-9b11-4cca-a80c-143da84eea6e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.634697 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.637979 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.638030 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.638106 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.638239 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.645005 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk"] Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.662247 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kttsg\" (UniqueName: \"kubernetes.io/projected/eace25c0-d234-43c5-88a0-f8ba1fc78dac-kube-api-access-kttsg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.662306 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.662495 4697 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.763998 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kttsg\" (UniqueName: \"kubernetes.io/projected/eace25c0-d234-43c5-88a0-f8ba1fc78dac-kube-api-access-kttsg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.764077 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.764390 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.775371 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.775385 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.786024 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kttsg\" (UniqueName: \"kubernetes.io/projected/eace25c0-d234-43c5-88a0-f8ba1fc78dac-kube-api-access-kttsg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:29 crc kubenswrapper[4697]: I0220 17:04:29.957730 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:30 crc kubenswrapper[4697]: I0220 17:04:30.527081 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk"] Feb 20 17:04:30 crc kubenswrapper[4697]: W0220 17:04:30.530251 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeace25c0_d234_43c5_88a0_f8ba1fc78dac.slice/crio-4d1afcb4adcae0b0a3a4f75fd2c09086eab78ff1898e03daada0001f69ec4be4 WatchSource:0}: Error finding container 4d1afcb4adcae0b0a3a4f75fd2c09086eab78ff1898e03daada0001f69ec4be4: Status 404 returned error can't find the container with id 4d1afcb4adcae0b0a3a4f75fd2c09086eab78ff1898e03daada0001f69ec4be4 Feb 20 17:04:31 crc kubenswrapper[4697]: I0220 17:04:31.184739 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:04:31 crc kubenswrapper[4697]: I0220 17:04:31.185017 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:04:31 crc kubenswrapper[4697]: I0220 17:04:31.474580 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" event={"ID":"eace25c0-d234-43c5-88a0-f8ba1fc78dac","Type":"ContainerStarted","Data":"bdf00449e10d94bdb5122589e2fd51fbb39bbf08e06e73b2e27a11f2458f54f5"} Feb 20 17:04:31 crc kubenswrapper[4697]: I0220 17:04:31.474901 4697 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" event={"ID":"eace25c0-d234-43c5-88a0-f8ba1fc78dac","Type":"ContainerStarted","Data":"4d1afcb4adcae0b0a3a4f75fd2c09086eab78ff1898e03daada0001f69ec4be4"} Feb 20 17:04:31 crc kubenswrapper[4697]: I0220 17:04:31.499740 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" podStartSLOduration=2.101278682 podStartE2EDuration="2.499717269s" podCreationTimestamp="2026-02-20 17:04:29 +0000 UTC" firstStartedPulling="2026-02-20 17:04:30.53352291 +0000 UTC m=+1978.313568358" lastFinishedPulling="2026-02-20 17:04:30.931961537 +0000 UTC m=+1978.712006945" observedRunningTime="2026-02-20 17:04:31.496735699 +0000 UTC m=+1979.276781107" watchObservedRunningTime="2026-02-20 17:04:31.499717269 +0000 UTC m=+1979.279762677" Feb 20 17:04:33 crc kubenswrapper[4697]: I0220 17:04:33.059201 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-rhzdd"] Feb 20 17:04:33 crc kubenswrapper[4697]: I0220 17:04:33.075571 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-rhzdd"] Feb 20 17:04:34 crc kubenswrapper[4697]: I0220 17:04:34.887222 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e29df702-9c2a-43c8-bfa2-d681e5f0286b" path="/var/lib/kubelet/pods/e29df702-9c2a-43c8-bfa2-d681e5f0286b/volumes" Feb 20 17:04:40 crc kubenswrapper[4697]: I0220 17:04:40.549641 4697 generic.go:334] "Generic (PLEG): container finished" podID="eace25c0-d234-43c5-88a0-f8ba1fc78dac" containerID="bdf00449e10d94bdb5122589e2fd51fbb39bbf08e06e73b2e27a11f2458f54f5" exitCode=0 Feb 20 17:04:40 crc kubenswrapper[4697]: I0220 17:04:40.549717 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" 
event={"ID":"eace25c0-d234-43c5-88a0-f8ba1fc78dac","Type":"ContainerDied","Data":"bdf00449e10d94bdb5122589e2fd51fbb39bbf08e06e73b2e27a11f2458f54f5"} Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.019921 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.210691 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kttsg\" (UniqueName: \"kubernetes.io/projected/eace25c0-d234-43c5-88a0-f8ba1fc78dac-kube-api-access-kttsg\") pod \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.210795 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-ssh-key-openstack-edpm-ipam\") pod \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.210820 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-inventory\") pod \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\" (UID: \"eace25c0-d234-43c5-88a0-f8ba1fc78dac\") " Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.220850 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eace25c0-d234-43c5-88a0-f8ba1fc78dac-kube-api-access-kttsg" (OuterVolumeSpecName: "kube-api-access-kttsg") pod "eace25c0-d234-43c5-88a0-f8ba1fc78dac" (UID: "eace25c0-d234-43c5-88a0-f8ba1fc78dac"). InnerVolumeSpecName "kube-api-access-kttsg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.238962 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eace25c0-d234-43c5-88a0-f8ba1fc78dac" (UID: "eace25c0-d234-43c5-88a0-f8ba1fc78dac"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.244068 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-inventory" (OuterVolumeSpecName: "inventory") pod "eace25c0-d234-43c5-88a0-f8ba1fc78dac" (UID: "eace25c0-d234-43c5-88a0-f8ba1fc78dac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.314034 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kttsg\" (UniqueName: \"kubernetes.io/projected/eace25c0-d234-43c5-88a0-f8ba1fc78dac-kube-api-access-kttsg\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.314794 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.314812 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eace25c0-d234-43c5-88a0-f8ba1fc78dac-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.591956 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" 
event={"ID":"eace25c0-d234-43c5-88a0-f8ba1fc78dac","Type":"ContainerDied","Data":"4d1afcb4adcae0b0a3a4f75fd2c09086eab78ff1898e03daada0001f69ec4be4"} Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.592004 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d1afcb4adcae0b0a3a4f75fd2c09086eab78ff1898e03daada0001f69ec4be4" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.592046 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.667031 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq"] Feb 20 17:04:42 crc kubenswrapper[4697]: E0220 17:04:42.667473 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eace25c0-d234-43c5-88a0-f8ba1fc78dac" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.667491 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="eace25c0-d234-43c5-88a0-f8ba1fc78dac" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.667704 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="eace25c0-d234-43c5-88a0-f8ba1fc78dac" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.668610 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.671513 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.671671 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.671809 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.671922 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.672028 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.672226 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.674326 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.675509 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.679654 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq"] Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824043 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh4zg\" (UniqueName: 
\"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-kube-api-access-sh4zg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824084 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824170 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824198 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824253 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824308 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824330 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824356 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824383 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824458 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824479 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824499 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824624 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.824673 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926460 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926540 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926562 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926583 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926619 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926650 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926681 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh4zg\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-kube-api-access-sh4zg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926701 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926747 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926773 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926801 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 
17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926862 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926880 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.926910 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.930950 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.931130 
4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.931957 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.932121 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.932865 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.933601 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.933973 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.934227 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.934754 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.938291 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.941109 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.942589 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.946100 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.946870 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh4zg\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-kube-api-access-sh4zg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:42 crc kubenswrapper[4697]: I0220 17:04:42.984987 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:04:43 crc kubenswrapper[4697]: I0220 17:04:43.325118 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq"] Feb 20 17:04:43 crc kubenswrapper[4697]: I0220 17:04:43.337895 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 17:04:43 crc kubenswrapper[4697]: I0220 17:04:43.605014 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" event={"ID":"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f","Type":"ContainerStarted","Data":"862ae027f0c11580bd57a1509a2a51d65e175568a344e24138b252026089ced5"} Feb 20 17:04:44 crc kubenswrapper[4697]: I0220 17:04:44.617303 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" event={"ID":"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f","Type":"ContainerStarted","Data":"12ea7ba7d43c1e05419e90f7ff1a499d0cafba5cfe63a0f6e953abde2094e432"} Feb 20 17:04:44 crc kubenswrapper[4697]: I0220 17:04:44.644775 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" podStartSLOduration=2.239800179 podStartE2EDuration="2.644746879s" podCreationTimestamp="2026-02-20 17:04:42 +0000 UTC" firstStartedPulling="2026-02-20 17:04:43.337696339 +0000 UTC m=+1991.117741747" lastFinishedPulling="2026-02-20 17:04:43.742643029 +0000 UTC m=+1991.522688447" observedRunningTime="2026-02-20 17:04:44.638262586 +0000 UTC m=+1992.418307994" watchObservedRunningTime="2026-02-20 17:04:44.644746879 +0000 UTC m=+1992.424792327" Feb 20 17:04:45 crc 
kubenswrapper[4697]: I0220 17:04:45.731279 4697 scope.go:117] "RemoveContainer" containerID="c399bc50be840adc92925f1fc55368c82f4b637e286e9cdec9a2cb7fe50d227c" Feb 20 17:04:45 crc kubenswrapper[4697]: I0220 17:04:45.765468 4697 scope.go:117] "RemoveContainer" containerID="1e28fbf0ae4c050426541d75380fcd78c64c5fbd2c2607160ec0e5a8313c38ea" Feb 20 17:04:45 crc kubenswrapper[4697]: I0220 17:04:45.814731 4697 scope.go:117] "RemoveContainer" containerID="2fab0bf3cef5f45dbbf1a0d28d2e66edcfc757a847d2de5005c363a251a621c0" Feb 20 17:05:01 crc kubenswrapper[4697]: I0220 17:05:01.184890 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:05:01 crc kubenswrapper[4697]: I0220 17:05:01.185637 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:05:01 crc kubenswrapper[4697]: I0220 17:05:01.185704 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 17:05:01 crc kubenswrapper[4697]: I0220 17:05:01.186781 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ab741227b7557de30c7c29e950e092010c8b169886e816c84c4df58e588aa0d"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 17:05:01 crc kubenswrapper[4697]: I0220 17:05:01.186867 4697 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://5ab741227b7557de30c7c29e950e092010c8b169886e816c84c4df58e588aa0d" gracePeriod=600 Feb 20 17:05:01 crc kubenswrapper[4697]: I0220 17:05:01.791250 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="5ab741227b7557de30c7c29e950e092010c8b169886e816c84c4df58e588aa0d" exitCode=0 Feb 20 17:05:01 crc kubenswrapper[4697]: I0220 17:05:01.791319 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"5ab741227b7557de30c7c29e950e092010c8b169886e816c84c4df58e588aa0d"} Feb 20 17:05:01 crc kubenswrapper[4697]: I0220 17:05:01.791792 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3"} Feb 20 17:05:01 crc kubenswrapper[4697]: I0220 17:05:01.791824 4697 scope.go:117] "RemoveContainer" containerID="b276cf714b86891cb62ef7688d6d7253d15b6a3e50b49f582bed524d93236382" Feb 20 17:05:20 crc kubenswrapper[4697]: I0220 17:05:20.968493 4697 generic.go:334] "Generic (PLEG): container finished" podID="04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" containerID="12ea7ba7d43c1e05419e90f7ff1a499d0cafba5cfe63a0f6e953abde2094e432" exitCode=0 Feb 20 17:05:20 crc kubenswrapper[4697]: I0220 17:05:20.968574 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" 
event={"ID":"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f","Type":"ContainerDied","Data":"12ea7ba7d43c1e05419e90f7ff1a499d0cafba5cfe63a0f6e953abde2094e432"} Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.472104 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614004 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614062 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614133 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-bootstrap-combined-ca-bundle\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614154 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-neutron-metadata-combined-ca-bundle\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc 
kubenswrapper[4697]: I0220 17:05:22.614185 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-inventory\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614260 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ssh-key-openstack-edpm-ipam\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614283 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614379 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh4zg\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-kube-api-access-sh4zg\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614413 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ovn-combined-ca-bundle\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614474 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-libvirt-combined-ca-bundle\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614538 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-telemetry-combined-ca-bundle\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614620 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.614698 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-repo-setup-combined-ca-bundle\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.615846 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-nova-combined-ca-bundle\") pod \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\" (UID: \"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f\") " Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.624012 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-libvirt-combined-ca-bundle" 
(OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.624082 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.624086 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.625341 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.625402 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.625490 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.625559 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.625643 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.625821 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-kube-api-access-sh4zg" (OuterVolumeSpecName: "kube-api-access-sh4zg") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "kube-api-access-sh4zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.625945 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.626090 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.627827 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.649279 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-inventory" (OuterVolumeSpecName: "inventory") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.651511 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" (UID: "04c7e6bd-f464-42ea-aa0b-a4b47a169d6f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718129 4697 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718184 4697 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718197 4697 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718207 4697 reconciler_common.go:293] "Volume detached for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718216 4697 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718226 4697 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718234 4697 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718263 4697 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718273 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718281 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc 
kubenswrapper[4697]: I0220 17:05:22.718290 4697 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718298 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh4zg\" (UniqueName: \"kubernetes.io/projected/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-kube-api-access-sh4zg\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718346 4697 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.718355 4697 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c7e6bd-f464-42ea-aa0b-a4b47a169d6f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.986942 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" event={"ID":"04c7e6bd-f464-42ea-aa0b-a4b47a169d6f","Type":"ContainerDied","Data":"862ae027f0c11580bd57a1509a2a51d65e175568a344e24138b252026089ced5"} Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.986969 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq" Feb 20 17:05:22 crc kubenswrapper[4697]: I0220 17:05:22.986976 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="862ae027f0c11580bd57a1509a2a51d65e175568a344e24138b252026089ced5" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.101073 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7"] Feb 20 17:05:23 crc kubenswrapper[4697]: E0220 17:05:23.101696 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.101770 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.102049 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="04c7e6bd-f464-42ea-aa0b-a4b47a169d6f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.103874 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.106205 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.106243 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.106829 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.106916 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.107133 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.113155 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7"] Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.230704 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.230882 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b1aebd55-3d79-403b-978d-04afedd25c3d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: 
\"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.230955 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq7gb\" (UniqueName: \"kubernetes.io/projected/b1aebd55-3d79-403b-978d-04afedd25c3d-kube-api-access-hq7gb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.230977 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.231048 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.332673 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b1aebd55-3d79-403b-978d-04afedd25c3d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.332750 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hq7gb\" (UniqueName: \"kubernetes.io/projected/b1aebd55-3d79-403b-978d-04afedd25c3d-kube-api-access-hq7gb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.332771 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.332820 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.332870 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.335531 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b1aebd55-3d79-403b-978d-04afedd25c3d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: 
\"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.337765 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.338168 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.346051 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.349019 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq7gb\" (UniqueName: \"kubernetes.io/projected/b1aebd55-3d79-403b-978d-04afedd25c3d-kube-api-access-hq7gb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-lc2h7\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.438393 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.951405 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7"] Feb 20 17:05:23 crc kubenswrapper[4697]: I0220 17:05:23.995697 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" event={"ID":"b1aebd55-3d79-403b-978d-04afedd25c3d","Type":"ContainerStarted","Data":"edda3c1be0c9afa0e3c295ec07ca5119edaf1805e6ebd938ee76a8348600ff46"} Feb 20 17:05:25 crc kubenswrapper[4697]: I0220 17:05:25.005854 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" event={"ID":"b1aebd55-3d79-403b-978d-04afedd25c3d","Type":"ContainerStarted","Data":"0765838de471a397e32a4ed6173c17a1e877197a48b0805cf057904be6fd9e14"} Feb 20 17:05:25 crc kubenswrapper[4697]: I0220 17:05:25.028653 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" podStartSLOduration=1.640525507 podStartE2EDuration="2.028614357s" podCreationTimestamp="2026-02-20 17:05:23 +0000 UTC" firstStartedPulling="2026-02-20 17:05:23.953288893 +0000 UTC m=+2031.733334291" lastFinishedPulling="2026-02-20 17:05:24.341377733 +0000 UTC m=+2032.121423141" observedRunningTime="2026-02-20 17:05:25.02461237 +0000 UTC m=+2032.804657778" watchObservedRunningTime="2026-02-20 17:05:25.028614357 +0000 UTC m=+2032.808659765" Feb 20 17:05:28 crc kubenswrapper[4697]: I0220 17:05:28.940052 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sl6p2"] Feb 20 17:05:28 crc kubenswrapper[4697]: I0220 17:05:28.943050 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:28 crc kubenswrapper[4697]: I0220 17:05:28.953120 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sl6p2"] Feb 20 17:05:29 crc kubenswrapper[4697]: I0220 17:05:29.048668 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54xkl\" (UniqueName: \"kubernetes.io/projected/0226ebe1-35af-42cb-84fb-acf13171680b-kube-api-access-54xkl\") pod \"community-operators-sl6p2\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:29 crc kubenswrapper[4697]: I0220 17:05:29.049235 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-catalog-content\") pod \"community-operators-sl6p2\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:29 crc kubenswrapper[4697]: I0220 17:05:29.049559 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-utilities\") pod \"community-operators-sl6p2\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:29 crc kubenswrapper[4697]: I0220 17:05:29.151551 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-catalog-content\") pod \"community-operators-sl6p2\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:29 crc kubenswrapper[4697]: I0220 17:05:29.151598 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-utilities\") pod \"community-operators-sl6p2\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:29 crc kubenswrapper[4697]: I0220 17:05:29.151703 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54xkl\" (UniqueName: \"kubernetes.io/projected/0226ebe1-35af-42cb-84fb-acf13171680b-kube-api-access-54xkl\") pod \"community-operators-sl6p2\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:29 crc kubenswrapper[4697]: I0220 17:05:29.152109 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-catalog-content\") pod \"community-operators-sl6p2\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:29 crc kubenswrapper[4697]: I0220 17:05:29.152292 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-utilities\") pod \"community-operators-sl6p2\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:29 crc kubenswrapper[4697]: I0220 17:05:29.170002 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54xkl\" (UniqueName: \"kubernetes.io/projected/0226ebe1-35af-42cb-84fb-acf13171680b-kube-api-access-54xkl\") pod \"community-operators-sl6p2\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:29 crc kubenswrapper[4697]: I0220 17:05:29.277541 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:29 crc kubenswrapper[4697]: W0220 17:05:29.748499 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0226ebe1_35af_42cb_84fb_acf13171680b.slice/crio-ece31c0dc5c41bd7d3d7b587e3c7bb3055b969f9242ab40e4de9c79457522737 WatchSource:0}: Error finding container ece31c0dc5c41bd7d3d7b587e3c7bb3055b969f9242ab40e4de9c79457522737: Status 404 returned error can't find the container with id ece31c0dc5c41bd7d3d7b587e3c7bb3055b969f9242ab40e4de9c79457522737 Feb 20 17:05:29 crc kubenswrapper[4697]: I0220 17:05:29.751864 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sl6p2"] Feb 20 17:05:30 crc kubenswrapper[4697]: I0220 17:05:30.063257 4697 generic.go:334] "Generic (PLEG): container finished" podID="0226ebe1-35af-42cb-84fb-acf13171680b" containerID="bcfa5c7b144c41bdb0e29fe1900210f451484a9004fe7f9a6c38820d5f1d8ea8" exitCode=0 Feb 20 17:05:30 crc kubenswrapper[4697]: I0220 17:05:30.063297 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl6p2" event={"ID":"0226ebe1-35af-42cb-84fb-acf13171680b","Type":"ContainerDied","Data":"bcfa5c7b144c41bdb0e29fe1900210f451484a9004fe7f9a6c38820d5f1d8ea8"} Feb 20 17:05:30 crc kubenswrapper[4697]: I0220 17:05:30.063528 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl6p2" event={"ID":"0226ebe1-35af-42cb-84fb-acf13171680b","Type":"ContainerStarted","Data":"ece31c0dc5c41bd7d3d7b587e3c7bb3055b969f9242ab40e4de9c79457522737"} Feb 20 17:05:31 crc kubenswrapper[4697]: I0220 17:05:31.073627 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl6p2" 
event={"ID":"0226ebe1-35af-42cb-84fb-acf13171680b","Type":"ContainerStarted","Data":"16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784"} Feb 20 17:05:32 crc kubenswrapper[4697]: I0220 17:05:32.085690 4697 generic.go:334] "Generic (PLEG): container finished" podID="0226ebe1-35af-42cb-84fb-acf13171680b" containerID="16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784" exitCode=0 Feb 20 17:05:32 crc kubenswrapper[4697]: I0220 17:05:32.085768 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl6p2" event={"ID":"0226ebe1-35af-42cb-84fb-acf13171680b","Type":"ContainerDied","Data":"16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784"} Feb 20 17:05:33 crc kubenswrapper[4697]: I0220 17:05:33.102423 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl6p2" event={"ID":"0226ebe1-35af-42cb-84fb-acf13171680b","Type":"ContainerStarted","Data":"a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7"} Feb 20 17:05:33 crc kubenswrapper[4697]: I0220 17:05:33.148934 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sl6p2" podStartSLOduration=2.744772596 podStartE2EDuration="5.148896853s" podCreationTimestamp="2026-02-20 17:05:28 +0000 UTC" firstStartedPulling="2026-02-20 17:05:30.064851869 +0000 UTC m=+2037.844897267" lastFinishedPulling="2026-02-20 17:05:32.468976116 +0000 UTC m=+2040.249021524" observedRunningTime="2026-02-20 17:05:33.128402106 +0000 UTC m=+2040.908447524" watchObservedRunningTime="2026-02-20 17:05:33.148896853 +0000 UTC m=+2040.928942301" Feb 20 17:05:39 crc kubenswrapper[4697]: I0220 17:05:39.278632 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:39 crc kubenswrapper[4697]: I0220 17:05:39.279188 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:39 crc kubenswrapper[4697]: I0220 17:05:39.330923 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:40 crc kubenswrapper[4697]: I0220 17:05:40.224004 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:40 crc kubenswrapper[4697]: I0220 17:05:40.291409 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sl6p2"] Feb 20 17:05:42 crc kubenswrapper[4697]: I0220 17:05:42.196586 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sl6p2" podUID="0226ebe1-35af-42cb-84fb-acf13171680b" containerName="registry-server" containerID="cri-o://a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7" gracePeriod=2 Feb 20 17:05:42 crc kubenswrapper[4697]: I0220 17:05:42.756054 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:42 crc kubenswrapper[4697]: I0220 17:05:42.827187 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-catalog-content\") pod \"0226ebe1-35af-42cb-84fb-acf13171680b\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " Feb 20 17:05:42 crc kubenswrapper[4697]: I0220 17:05:42.827319 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-utilities\") pod \"0226ebe1-35af-42cb-84fb-acf13171680b\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " Feb 20 17:05:42 crc kubenswrapper[4697]: I0220 17:05:42.829227 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-utilities" (OuterVolumeSpecName: "utilities") pod "0226ebe1-35af-42cb-84fb-acf13171680b" (UID: "0226ebe1-35af-42cb-84fb-acf13171680b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:05:42 crc kubenswrapper[4697]: I0220 17:05:42.829654 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54xkl\" (UniqueName: \"kubernetes.io/projected/0226ebe1-35af-42cb-84fb-acf13171680b-kube-api-access-54xkl\") pod \"0226ebe1-35af-42cb-84fb-acf13171680b\" (UID: \"0226ebe1-35af-42cb-84fb-acf13171680b\") " Feb 20 17:05:42 crc kubenswrapper[4697]: I0220 17:05:42.833282 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:42 crc kubenswrapper[4697]: I0220 17:05:42.839053 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0226ebe1-35af-42cb-84fb-acf13171680b-kube-api-access-54xkl" (OuterVolumeSpecName: "kube-api-access-54xkl") pod "0226ebe1-35af-42cb-84fb-acf13171680b" (UID: "0226ebe1-35af-42cb-84fb-acf13171680b"). InnerVolumeSpecName "kube-api-access-54xkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:05:42 crc kubenswrapper[4697]: I0220 17:05:42.875359 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0226ebe1-35af-42cb-84fb-acf13171680b" (UID: "0226ebe1-35af-42cb-84fb-acf13171680b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:05:42 crc kubenswrapper[4697]: I0220 17:05:42.935618 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54xkl\" (UniqueName: \"kubernetes.io/projected/0226ebe1-35af-42cb-84fb-acf13171680b-kube-api-access-54xkl\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:42 crc kubenswrapper[4697]: I0220 17:05:42.935661 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0226ebe1-35af-42cb-84fb-acf13171680b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.206769 4697 generic.go:334] "Generic (PLEG): container finished" podID="0226ebe1-35af-42cb-84fb-acf13171680b" containerID="a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7" exitCode=0 Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.206811 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl6p2" event={"ID":"0226ebe1-35af-42cb-84fb-acf13171680b","Type":"ContainerDied","Data":"a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7"} Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.206837 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl6p2" event={"ID":"0226ebe1-35af-42cb-84fb-acf13171680b","Type":"ContainerDied","Data":"ece31c0dc5c41bd7d3d7b587e3c7bb3055b969f9242ab40e4de9c79457522737"} Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.206837 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sl6p2" Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.206856 4697 scope.go:117] "RemoveContainer" containerID="a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7" Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.236784 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sl6p2"] Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.238548 4697 scope.go:117] "RemoveContainer" containerID="16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784" Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.248143 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sl6p2"] Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.270613 4697 scope.go:117] "RemoveContainer" containerID="bcfa5c7b144c41bdb0e29fe1900210f451484a9004fe7f9a6c38820d5f1d8ea8" Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.305436 4697 scope.go:117] "RemoveContainer" containerID="a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7" Feb 20 17:05:43 crc kubenswrapper[4697]: E0220 17:05:43.305935 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7\": container with ID starting with a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7 not found: ID does not exist" containerID="a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7" Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.305997 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7"} err="failed to get container status \"a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7\": rpc error: code = NotFound desc = could not find 
container \"a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7\": container with ID starting with a541ee5527b8568e4c32c9fedfabf563685bb4f6b9e4f33796470f1313f67cc7 not found: ID does not exist" Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.306041 4697 scope.go:117] "RemoveContainer" containerID="16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784" Feb 20 17:05:43 crc kubenswrapper[4697]: E0220 17:05:43.306422 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784\": container with ID starting with 16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784 not found: ID does not exist" containerID="16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784" Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.306455 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784"} err="failed to get container status \"16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784\": rpc error: code = NotFound desc = could not find container \"16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784\": container with ID starting with 16bdda3ed6fe6ea72536d15b572fd2bb14002b732bb6b993b16dca06718a0784 not found: ID does not exist" Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.306472 4697 scope.go:117] "RemoveContainer" containerID="bcfa5c7b144c41bdb0e29fe1900210f451484a9004fe7f9a6c38820d5f1d8ea8" Feb 20 17:05:43 crc kubenswrapper[4697]: E0220 17:05:43.306811 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcfa5c7b144c41bdb0e29fe1900210f451484a9004fe7f9a6c38820d5f1d8ea8\": container with ID starting with bcfa5c7b144c41bdb0e29fe1900210f451484a9004fe7f9a6c38820d5f1d8ea8 not found: ID does 
not exist" containerID="bcfa5c7b144c41bdb0e29fe1900210f451484a9004fe7f9a6c38820d5f1d8ea8" Feb 20 17:05:43 crc kubenswrapper[4697]: I0220 17:05:43.306861 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcfa5c7b144c41bdb0e29fe1900210f451484a9004fe7f9a6c38820d5f1d8ea8"} err="failed to get container status \"bcfa5c7b144c41bdb0e29fe1900210f451484a9004fe7f9a6c38820d5f1d8ea8\": rpc error: code = NotFound desc = could not find container \"bcfa5c7b144c41bdb0e29fe1900210f451484a9004fe7f9a6c38820d5f1d8ea8\": container with ID starting with bcfa5c7b144c41bdb0e29fe1900210f451484a9004fe7f9a6c38820d5f1d8ea8 not found: ID does not exist" Feb 20 17:05:44 crc kubenswrapper[4697]: I0220 17:05:44.897939 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0226ebe1-35af-42cb-84fb-acf13171680b" path="/var/lib/kubelet/pods/0226ebe1-35af-42cb-84fb-acf13171680b/volumes" Feb 20 17:06:30 crc kubenswrapper[4697]: I0220 17:06:30.735124 4697 generic.go:334] "Generic (PLEG): container finished" podID="b1aebd55-3d79-403b-978d-04afedd25c3d" containerID="0765838de471a397e32a4ed6173c17a1e877197a48b0805cf057904be6fd9e14" exitCode=0 Feb 20 17:06:30 crc kubenswrapper[4697]: I0220 17:06:30.735334 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" event={"ID":"b1aebd55-3d79-403b-978d-04afedd25c3d","Type":"ContainerDied","Data":"0765838de471a397e32a4ed6173c17a1e877197a48b0805cf057904be6fd9e14"} Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.153544 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.301082 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-inventory\") pod \"b1aebd55-3d79-403b-978d-04afedd25c3d\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.301173 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b1aebd55-3d79-403b-978d-04afedd25c3d-ovncontroller-config-0\") pod \"b1aebd55-3d79-403b-978d-04afedd25c3d\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.301786 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ovn-combined-ca-bundle\") pod \"b1aebd55-3d79-403b-978d-04afedd25c3d\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.301974 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ssh-key-openstack-edpm-ipam\") pod \"b1aebd55-3d79-403b-978d-04afedd25c3d\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.302045 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq7gb\" (UniqueName: \"kubernetes.io/projected/b1aebd55-3d79-403b-978d-04afedd25c3d-kube-api-access-hq7gb\") pod \"b1aebd55-3d79-403b-978d-04afedd25c3d\" (UID: \"b1aebd55-3d79-403b-978d-04afedd25c3d\") " Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.307817 4697 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b1aebd55-3d79-403b-978d-04afedd25c3d" (UID: "b1aebd55-3d79-403b-978d-04afedd25c3d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.308834 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1aebd55-3d79-403b-978d-04afedd25c3d-kube-api-access-hq7gb" (OuterVolumeSpecName: "kube-api-access-hq7gb") pod "b1aebd55-3d79-403b-978d-04afedd25c3d" (UID: "b1aebd55-3d79-403b-978d-04afedd25c3d"). InnerVolumeSpecName "kube-api-access-hq7gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.325932 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1aebd55-3d79-403b-978d-04afedd25c3d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "b1aebd55-3d79-403b-978d-04afedd25c3d" (UID: "b1aebd55-3d79-403b-978d-04afedd25c3d"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.341664 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-inventory" (OuterVolumeSpecName: "inventory") pod "b1aebd55-3d79-403b-978d-04afedd25c3d" (UID: "b1aebd55-3d79-403b-978d-04afedd25c3d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.345841 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b1aebd55-3d79-403b-978d-04afedd25c3d" (UID: "b1aebd55-3d79-403b-978d-04afedd25c3d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.404481 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.404516 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq7gb\" (UniqueName: \"kubernetes.io/projected/b1aebd55-3d79-403b-978d-04afedd25c3d-kube-api-access-hq7gb\") on node \"crc\" DevicePath \"\"" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.404531 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.404544 4697 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b1aebd55-3d79-403b-978d-04afedd25c3d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.404557 4697 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1aebd55-3d79-403b-978d-04afedd25c3d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.762973 4697 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" event={"ID":"b1aebd55-3d79-403b-978d-04afedd25c3d","Type":"ContainerDied","Data":"edda3c1be0c9afa0e3c295ec07ca5119edaf1805e6ebd938ee76a8348600ff46"} Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.763411 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edda3c1be0c9afa0e3c295ec07ca5119edaf1805e6ebd938ee76a8348600ff46" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.763022 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-lc2h7" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.858547 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf"] Feb 20 17:06:32 crc kubenswrapper[4697]: E0220 17:06:32.859077 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0226ebe1-35af-42cb-84fb-acf13171680b" containerName="registry-server" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.859103 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="0226ebe1-35af-42cb-84fb-acf13171680b" containerName="registry-server" Feb 20 17:06:32 crc kubenswrapper[4697]: E0220 17:06:32.859145 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0226ebe1-35af-42cb-84fb-acf13171680b" containerName="extract-content" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.859157 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="0226ebe1-35af-42cb-84fb-acf13171680b" containerName="extract-content" Feb 20 17:06:32 crc kubenswrapper[4697]: E0220 17:06:32.859187 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1aebd55-3d79-403b-978d-04afedd25c3d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.859199 4697 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b1aebd55-3d79-403b-978d-04afedd25c3d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 20 17:06:32 crc kubenswrapper[4697]: E0220 17:06:32.859225 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0226ebe1-35af-42cb-84fb-acf13171680b" containerName="extract-utilities" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.859236 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="0226ebe1-35af-42cb-84fb-acf13171680b" containerName="extract-utilities" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.859524 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1aebd55-3d79-403b-978d-04afedd25c3d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.859557 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="0226ebe1-35af-42cb-84fb-acf13171680b" containerName="registry-server" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.863663 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.867001 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.867060 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.867237 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.867354 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.867422 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.867622 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.871670 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf"] Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.953559 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psx55\" (UniqueName: \"kubernetes.io/projected/8190a9c0-1f92-4f97-8d67-04668a6920a2-kube-api-access-psx55\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.953948 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.954111 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.954246 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.954358 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:32 crc kubenswrapper[4697]: I0220 17:06:32.954478 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.055715 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.056053 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.056211 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.056337 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.056513 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psx55\" (UniqueName: \"kubernetes.io/projected/8190a9c0-1f92-4f97-8d67-04668a6920a2-kube-api-access-psx55\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.057182 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.060546 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.060619 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.066667 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: E0220 17:06:33.067522 4697 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1aebd55_3d79_403b_978d_04afedd25c3d.slice/crio-edda3c1be0c9afa0e3c295ec07ca5119edaf1805e6ebd938ee76a8348600ff46\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1aebd55_3d79_403b_978d_04afedd25c3d.slice\": RecentStats: unable to find data in memory cache]" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.069262 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.070021 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.086212 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psx55\" (UniqueName: \"kubernetes.io/projected/8190a9c0-1f92-4f97-8d67-04668a6920a2-kube-api-access-psx55\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.203014 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.709813 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf"] Feb 20 17:06:33 crc kubenswrapper[4697]: I0220 17:06:33.777583 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" event={"ID":"8190a9c0-1f92-4f97-8d67-04668a6920a2","Type":"ContainerStarted","Data":"4116e45a8c43a0455fe6b1958608300a5533fd5adc84e4570e6cf6202d2fcdf2"} Feb 20 17:06:34 crc kubenswrapper[4697]: I0220 17:06:34.786281 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" event={"ID":"8190a9c0-1f92-4f97-8d67-04668a6920a2","Type":"ContainerStarted","Data":"b556d98125b7e80be41d3ae0c104a993a7ce0a8e12632d56607b1897402d53e9"} Feb 20 17:06:34 crc kubenswrapper[4697]: I0220 17:06:34.809127 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" podStartSLOduration=2.153567576 
podStartE2EDuration="2.809107982s" podCreationTimestamp="2026-02-20 17:06:32 +0000 UTC" firstStartedPulling="2026-02-20 17:06:33.735805505 +0000 UTC m=+2101.515850953" lastFinishedPulling="2026-02-20 17:06:34.391345961 +0000 UTC m=+2102.171391359" observedRunningTime="2026-02-20 17:06:34.799799616 +0000 UTC m=+2102.579845024" watchObservedRunningTime="2026-02-20 17:06:34.809107982 +0000 UTC m=+2102.589153390" Feb 20 17:07:01 crc kubenswrapper[4697]: I0220 17:07:01.185284 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:07:01 crc kubenswrapper[4697]: I0220 17:07:01.187285 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:07:25 crc kubenswrapper[4697]: I0220 17:07:25.246849 4697 generic.go:334] "Generic (PLEG): container finished" podID="8190a9c0-1f92-4f97-8d67-04668a6920a2" containerID="b556d98125b7e80be41d3ae0c104a993a7ce0a8e12632d56607b1897402d53e9" exitCode=0 Feb 20 17:07:25 crc kubenswrapper[4697]: I0220 17:07:25.246949 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" event={"ID":"8190a9c0-1f92-4f97-8d67-04668a6920a2","Type":"ContainerDied","Data":"b556d98125b7e80be41d3ae0c104a993a7ce0a8e12632d56607b1897402d53e9"} Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.745292 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.842771 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psx55\" (UniqueName: \"kubernetes.io/projected/8190a9c0-1f92-4f97-8d67-04668a6920a2-kube-api-access-psx55\") pod \"8190a9c0-1f92-4f97-8d67-04668a6920a2\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.842829 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"8190a9c0-1f92-4f97-8d67-04668a6920a2\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.842857 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-nova-metadata-neutron-config-0\") pod \"8190a9c0-1f92-4f97-8d67-04668a6920a2\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.842894 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-metadata-combined-ca-bundle\") pod \"8190a9c0-1f92-4f97-8d67-04668a6920a2\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.842934 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-ssh-key-openstack-edpm-ipam\") pod \"8190a9c0-1f92-4f97-8d67-04668a6920a2\" (UID: 
\"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.843023 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-inventory\") pod \"8190a9c0-1f92-4f97-8d67-04668a6920a2\" (UID: \"8190a9c0-1f92-4f97-8d67-04668a6920a2\") " Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.853644 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8190a9c0-1f92-4f97-8d67-04668a6920a2" (UID: "8190a9c0-1f92-4f97-8d67-04668a6920a2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.853676 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8190a9c0-1f92-4f97-8d67-04668a6920a2-kube-api-access-psx55" (OuterVolumeSpecName: "kube-api-access-psx55") pod "8190a9c0-1f92-4f97-8d67-04668a6920a2" (UID: "8190a9c0-1f92-4f97-8d67-04668a6920a2"). InnerVolumeSpecName "kube-api-access-psx55". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.872790 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8190a9c0-1f92-4f97-8d67-04668a6920a2" (UID: "8190a9c0-1f92-4f97-8d67-04668a6920a2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.876004 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "8190a9c0-1f92-4f97-8d67-04668a6920a2" (UID: "8190a9c0-1f92-4f97-8d67-04668a6920a2"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.876127 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-inventory" (OuterVolumeSpecName: "inventory") pod "8190a9c0-1f92-4f97-8d67-04668a6920a2" (UID: "8190a9c0-1f92-4f97-8d67-04668a6920a2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.877552 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "8190a9c0-1f92-4f97-8d67-04668a6920a2" (UID: "8190a9c0-1f92-4f97-8d67-04668a6920a2"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.945078 4697 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.945120 4697 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.945133 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.945146 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.945157 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psx55\" (UniqueName: \"kubernetes.io/projected/8190a9c0-1f92-4f97-8d67-04668a6920a2-kube-api-access-psx55\") on node \"crc\" DevicePath \"\"" Feb 20 17:07:26 crc kubenswrapper[4697]: I0220 17:07:26.945168 4697 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8190a9c0-1f92-4f97-8d67-04668a6920a2-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.267253 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" event={"ID":"8190a9c0-1f92-4f97-8d67-04668a6920a2","Type":"ContainerDied","Data":"4116e45a8c43a0455fe6b1958608300a5533fd5adc84e4570e6cf6202d2fcdf2"} Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.267296 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4116e45a8c43a0455fe6b1958608300a5533fd5adc84e4570e6cf6202d2fcdf2" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.267345 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.357912 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4"] Feb 20 17:07:27 crc kubenswrapper[4697]: E0220 17:07:27.358291 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8190a9c0-1f92-4f97-8d67-04668a6920a2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.358306 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="8190a9c0-1f92-4f97-8d67-04668a6920a2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.358507 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="8190a9c0-1f92-4f97-8d67-04668a6920a2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.359116 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.360755 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.360795 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.361180 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.362087 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.362203 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.373687 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4"] Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.454113 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.454191 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.454309 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.454399 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5mrl\" (UniqueName: \"kubernetes.io/projected/e6c6663b-45c5-4629-98fb-23de62292ee1-kube-api-access-d5mrl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.454483 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.556303 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5mrl\" (UniqueName: \"kubernetes.io/projected/e6c6663b-45c5-4629-98fb-23de62292ee1-kube-api-access-d5mrl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.556366 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.556428 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.556474 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.556536 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.560308 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: 
\"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.561238 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.561303 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.568066 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.575696 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5mrl\" (UniqueName: \"kubernetes.io/projected/e6c6663b-45c5-4629-98fb-23de62292ee1-kube-api-access-d5mrl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:27 crc kubenswrapper[4697]: I0220 17:07:27.676484 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:07:28 crc kubenswrapper[4697]: I0220 17:07:28.226419 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4"] Feb 20 17:07:28 crc kubenswrapper[4697]: I0220 17:07:28.280711 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" event={"ID":"e6c6663b-45c5-4629-98fb-23de62292ee1","Type":"ContainerStarted","Data":"e7b0c749b419fc6ea07e40abd179972a98a7477989a78454497ec1003340f5a2"} Feb 20 17:07:29 crc kubenswrapper[4697]: I0220 17:07:29.295833 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" event={"ID":"e6c6663b-45c5-4629-98fb-23de62292ee1","Type":"ContainerStarted","Data":"a6c22a60fcada9e92cb66879c38c91f372ae8f897c2e94bd5fd9d9eb79e7679b"} Feb 20 17:07:29 crc kubenswrapper[4697]: I0220 17:07:29.313963 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" podStartSLOduration=1.8565008939999998 podStartE2EDuration="2.313942353s" podCreationTimestamp="2026-02-20 17:07:27 +0000 UTC" firstStartedPulling="2026-02-20 17:07:28.230339806 +0000 UTC m=+2156.010385214" lastFinishedPulling="2026-02-20 17:07:28.687781265 +0000 UTC m=+2156.467826673" observedRunningTime="2026-02-20 17:07:29.309469284 +0000 UTC m=+2157.089514692" watchObservedRunningTime="2026-02-20 17:07:29.313942353 +0000 UTC m=+2157.093987761" Feb 20 17:07:31 crc kubenswrapper[4697]: I0220 17:07:31.184829 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:07:31 crc kubenswrapper[4697]: 
I0220 17:07:31.185199 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:08:01 crc kubenswrapper[4697]: I0220 17:08:01.184724 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:08:01 crc kubenswrapper[4697]: I0220 17:08:01.185473 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:08:01 crc kubenswrapper[4697]: I0220 17:08:01.185537 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 17:08:01 crc kubenswrapper[4697]: I0220 17:08:01.186555 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 17:08:01 crc kubenswrapper[4697]: I0220 17:08:01.186658 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" 
containerName="machine-config-daemon" containerID="cri-o://ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" gracePeriod=600 Feb 20 17:08:01 crc kubenswrapper[4697]: E0220 17:08:01.316119 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:08:01 crc kubenswrapper[4697]: I0220 17:08:01.599987 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" exitCode=0 Feb 20 17:08:01 crc kubenswrapper[4697]: I0220 17:08:01.600031 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3"} Feb 20 17:08:01 crc kubenswrapper[4697]: I0220 17:08:01.600101 4697 scope.go:117] "RemoveContainer" containerID="5ab741227b7557de30c7c29e950e092010c8b169886e816c84c4df58e588aa0d" Feb 20 17:08:01 crc kubenswrapper[4697]: I0220 17:08:01.600914 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:08:01 crc kubenswrapper[4697]: E0220 17:08:01.601309 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:08:13 crc kubenswrapper[4697]: I0220 17:08:13.876975 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:08:13 crc kubenswrapper[4697]: E0220 17:08:13.877806 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:08:24 crc kubenswrapper[4697]: I0220 17:08:24.877861 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:08:24 crc kubenswrapper[4697]: E0220 17:08:24.879201 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:08:36 crc kubenswrapper[4697]: I0220 17:08:36.878066 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:08:36 crc kubenswrapper[4697]: E0220 17:08:36.878876 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:08:50 crc kubenswrapper[4697]: I0220 17:08:50.878209 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:08:50 crc kubenswrapper[4697]: E0220 17:08:50.879283 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:09:04 crc kubenswrapper[4697]: I0220 17:09:04.877364 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:09:04 crc kubenswrapper[4697]: E0220 17:09:04.878292 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:09:16 crc kubenswrapper[4697]: I0220 17:09:16.877562 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:09:16 crc kubenswrapper[4697]: E0220 17:09:16.878866 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:09:29 crc kubenswrapper[4697]: I0220 17:09:29.877788 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:09:29 crc kubenswrapper[4697]: E0220 17:09:29.879104 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:09:44 crc kubenswrapper[4697]: I0220 17:09:44.877073 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:09:44 crc kubenswrapper[4697]: E0220 17:09:44.877739 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:09:58 crc kubenswrapper[4697]: I0220 17:09:58.877144 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:09:58 crc kubenswrapper[4697]: E0220 17:09:58.878277 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:10:09 crc kubenswrapper[4697]: I0220 17:10:09.878689 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:10:09 crc kubenswrapper[4697]: E0220 17:10:09.879578 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:10:23 crc kubenswrapper[4697]: I0220 17:10:23.877851 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:10:23 crc kubenswrapper[4697]: E0220 17:10:23.879261 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:10:37 crc kubenswrapper[4697]: I0220 17:10:37.877131 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:10:37 crc kubenswrapper[4697]: E0220 17:10:37.877908 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:10:51 crc kubenswrapper[4697]: I0220 17:10:51.877189 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:10:51 crc kubenswrapper[4697]: E0220 17:10:51.877912 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:11:06 crc kubenswrapper[4697]: I0220 17:11:06.877914 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:11:06 crc kubenswrapper[4697]: E0220 17:11:06.880199 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:11:16 crc kubenswrapper[4697]: I0220 17:11:16.825786 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvdq"] Feb 20 17:11:16 crc kubenswrapper[4697]: I0220 17:11:16.828767 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:16 crc kubenswrapper[4697]: I0220 17:11:16.843187 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvdq"] Feb 20 17:11:16 crc kubenswrapper[4697]: I0220 17:11:16.997158 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-catalog-content\") pod \"redhat-marketplace-zjvdq\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:16 crc kubenswrapper[4697]: I0220 17:11:16.999164 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-utilities\") pod \"redhat-marketplace-zjvdq\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:16 crc kubenswrapper[4697]: I0220 17:11:16.999316 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4fp2\" (UniqueName: \"kubernetes.io/projected/e6215194-2667-48c3-8eea-25bc74ea3c70-kube-api-access-m4fp2\") pod \"redhat-marketplace-zjvdq\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:17 crc kubenswrapper[4697]: I0220 17:11:17.101670 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-catalog-content\") pod \"redhat-marketplace-zjvdq\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:17 crc kubenswrapper[4697]: I0220 17:11:17.102105 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-utilities\") pod \"redhat-marketplace-zjvdq\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:17 crc kubenswrapper[4697]: I0220 17:11:17.102258 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4fp2\" (UniqueName: \"kubernetes.io/projected/e6215194-2667-48c3-8eea-25bc74ea3c70-kube-api-access-m4fp2\") pod \"redhat-marketplace-zjvdq\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:17 crc kubenswrapper[4697]: I0220 17:11:17.102374 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-catalog-content\") pod \"redhat-marketplace-zjvdq\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:17 crc kubenswrapper[4697]: I0220 17:11:17.102647 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-utilities\") pod \"redhat-marketplace-zjvdq\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:17 crc kubenswrapper[4697]: I0220 17:11:17.129479 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4fp2\" (UniqueName: \"kubernetes.io/projected/e6215194-2667-48c3-8eea-25bc74ea3c70-kube-api-access-m4fp2\") pod \"redhat-marketplace-zjvdq\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:17 crc kubenswrapper[4697]: I0220 17:11:17.154565 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:17 crc kubenswrapper[4697]: I0220 17:11:17.689086 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvdq"] Feb 20 17:11:17 crc kubenswrapper[4697]: I0220 17:11:17.877822 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:11:17 crc kubenswrapper[4697]: E0220 17:11:17.880075 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:11:18 crc kubenswrapper[4697]: I0220 17:11:18.576591 4697 generic.go:334] "Generic (PLEG): container finished" podID="e6215194-2667-48c3-8eea-25bc74ea3c70" containerID="4c77870c93eb9985838d8fd61d28e7b12e14524b04a94ee4abf3019433d5db3f" exitCode=0 Feb 20 17:11:18 crc kubenswrapper[4697]: I0220 17:11:18.576665 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvdq" event={"ID":"e6215194-2667-48c3-8eea-25bc74ea3c70","Type":"ContainerDied","Data":"4c77870c93eb9985838d8fd61d28e7b12e14524b04a94ee4abf3019433d5db3f"} Feb 20 17:11:18 crc kubenswrapper[4697]: I0220 17:11:18.576727 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvdq" event={"ID":"e6215194-2667-48c3-8eea-25bc74ea3c70","Type":"ContainerStarted","Data":"aba62e2c05f7ff24ff97bab16b9ad9508349868f733e9bdafe3eaf39cf51bbc8"} Feb 20 17:11:18 crc kubenswrapper[4697]: I0220 17:11:18.579033 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 
17:11:19 crc kubenswrapper[4697]: I0220 17:11:19.589301 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvdq" event={"ID":"e6215194-2667-48c3-8eea-25bc74ea3c70","Type":"ContainerStarted","Data":"552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690"} Feb 20 17:11:20 crc kubenswrapper[4697]: I0220 17:11:20.600426 4697 generic.go:334] "Generic (PLEG): container finished" podID="e6215194-2667-48c3-8eea-25bc74ea3c70" containerID="552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690" exitCode=0 Feb 20 17:11:20 crc kubenswrapper[4697]: I0220 17:11:20.600550 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvdq" event={"ID":"e6215194-2667-48c3-8eea-25bc74ea3c70","Type":"ContainerDied","Data":"552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690"} Feb 20 17:11:21 crc kubenswrapper[4697]: I0220 17:11:21.634170 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvdq" event={"ID":"e6215194-2667-48c3-8eea-25bc74ea3c70","Type":"ContainerStarted","Data":"fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1"} Feb 20 17:11:21 crc kubenswrapper[4697]: I0220 17:11:21.657970 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zjvdq" podStartSLOduration=3.252299425 podStartE2EDuration="5.657951268s" podCreationTimestamp="2026-02-20 17:11:16 +0000 UTC" firstStartedPulling="2026-02-20 17:11:18.578378991 +0000 UTC m=+2386.358424449" lastFinishedPulling="2026-02-20 17:11:20.984030874 +0000 UTC m=+2388.764076292" observedRunningTime="2026-02-20 17:11:21.654218057 +0000 UTC m=+2389.434263465" watchObservedRunningTime="2026-02-20 17:11:21.657951268 +0000 UTC m=+2389.437996686" Feb 20 17:11:27 crc kubenswrapper[4697]: I0220 17:11:27.155487 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:27 crc kubenswrapper[4697]: I0220 17:11:27.156353 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:27 crc kubenswrapper[4697]: I0220 17:11:27.213118 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:27 crc kubenswrapper[4697]: I0220 17:11:27.752997 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:27 crc kubenswrapper[4697]: I0220 17:11:27.800579 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvdq"] Feb 20 17:11:29 crc kubenswrapper[4697]: I0220 17:11:29.718710 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zjvdq" podUID="e6215194-2667-48c3-8eea-25bc74ea3c70" containerName="registry-server" containerID="cri-o://fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1" gracePeriod=2 Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.188480 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.371004 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4fp2\" (UniqueName: \"kubernetes.io/projected/e6215194-2667-48c3-8eea-25bc74ea3c70-kube-api-access-m4fp2\") pod \"e6215194-2667-48c3-8eea-25bc74ea3c70\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.371142 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-catalog-content\") pod \"e6215194-2667-48c3-8eea-25bc74ea3c70\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.371257 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-utilities\") pod \"e6215194-2667-48c3-8eea-25bc74ea3c70\" (UID: \"e6215194-2667-48c3-8eea-25bc74ea3c70\") " Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.372660 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-utilities" (OuterVolumeSpecName: "utilities") pod "e6215194-2667-48c3-8eea-25bc74ea3c70" (UID: "e6215194-2667-48c3-8eea-25bc74ea3c70"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.372946 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.375961 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6215194-2667-48c3-8eea-25bc74ea3c70-kube-api-access-m4fp2" (OuterVolumeSpecName: "kube-api-access-m4fp2") pod "e6215194-2667-48c3-8eea-25bc74ea3c70" (UID: "e6215194-2667-48c3-8eea-25bc74ea3c70"). InnerVolumeSpecName "kube-api-access-m4fp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.395245 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6215194-2667-48c3-8eea-25bc74ea3c70" (UID: "e6215194-2667-48c3-8eea-25bc74ea3c70"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.474680 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6215194-2667-48c3-8eea-25bc74ea3c70-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.474709 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4fp2\" (UniqueName: \"kubernetes.io/projected/e6215194-2667-48c3-8eea-25bc74ea3c70-kube-api-access-m4fp2\") on node \"crc\" DevicePath \"\"" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.728342 4697 generic.go:334] "Generic (PLEG): container finished" podID="e6215194-2667-48c3-8eea-25bc74ea3c70" containerID="fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1" exitCode=0 Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.728420 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjvdq" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.728459 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvdq" event={"ID":"e6215194-2667-48c3-8eea-25bc74ea3c70","Type":"ContainerDied","Data":"fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1"} Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.729677 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvdq" event={"ID":"e6215194-2667-48c3-8eea-25bc74ea3c70","Type":"ContainerDied","Data":"aba62e2c05f7ff24ff97bab16b9ad9508349868f733e9bdafe3eaf39cf51bbc8"} Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.729724 4697 scope.go:117] "RemoveContainer" containerID="fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.752613 4697 scope.go:117] "RemoveContainer" 
containerID="552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.771648 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvdq"] Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.776221 4697 scope.go:117] "RemoveContainer" containerID="4c77870c93eb9985838d8fd61d28e7b12e14524b04a94ee4abf3019433d5db3f" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.791724 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvdq"] Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.823590 4697 scope.go:117] "RemoveContainer" containerID="fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1" Feb 20 17:11:30 crc kubenswrapper[4697]: E0220 17:11:30.824017 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1\": container with ID starting with fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1 not found: ID does not exist" containerID="fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.824057 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1"} err="failed to get container status \"fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1\": rpc error: code = NotFound desc = could not find container \"fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1\": container with ID starting with fe8434f2656147874f59098fa3e564d49ad672e58424db53280410036f6dabd1 not found: ID does not exist" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.824086 4697 scope.go:117] "RemoveContainer" 
containerID="552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690" Feb 20 17:11:30 crc kubenswrapper[4697]: E0220 17:11:30.824633 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690\": container with ID starting with 552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690 not found: ID does not exist" containerID="552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.824652 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690"} err="failed to get container status \"552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690\": rpc error: code = NotFound desc = could not find container \"552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690\": container with ID starting with 552b3a4678ca569015b4c10245d87587610558cc9b6112c637486dc38bed9690 not found: ID does not exist" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.824665 4697 scope.go:117] "RemoveContainer" containerID="4c77870c93eb9985838d8fd61d28e7b12e14524b04a94ee4abf3019433d5db3f" Feb 20 17:11:30 crc kubenswrapper[4697]: E0220 17:11:30.824998 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c77870c93eb9985838d8fd61d28e7b12e14524b04a94ee4abf3019433d5db3f\": container with ID starting with 4c77870c93eb9985838d8fd61d28e7b12e14524b04a94ee4abf3019433d5db3f not found: ID does not exist" containerID="4c77870c93eb9985838d8fd61d28e7b12e14524b04a94ee4abf3019433d5db3f" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.825025 4697 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c77870c93eb9985838d8fd61d28e7b12e14524b04a94ee4abf3019433d5db3f"} err="failed to get container status \"4c77870c93eb9985838d8fd61d28e7b12e14524b04a94ee4abf3019433d5db3f\": rpc error: code = NotFound desc = could not find container \"4c77870c93eb9985838d8fd61d28e7b12e14524b04a94ee4abf3019433d5db3f\": container with ID starting with 4c77870c93eb9985838d8fd61d28e7b12e14524b04a94ee4abf3019433d5db3f not found: ID does not exist" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.877564 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:11:30 crc kubenswrapper[4697]: E0220 17:11:30.877810 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:11:30 crc kubenswrapper[4697]: I0220 17:11:30.886940 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6215194-2667-48c3-8eea-25bc74ea3c70" path="/var/lib/kubelet/pods/e6215194-2667-48c3-8eea-25bc74ea3c70/volumes" Feb 20 17:11:32 crc kubenswrapper[4697]: I0220 17:11:32.766545 4697 generic.go:334] "Generic (PLEG): container finished" podID="e6c6663b-45c5-4629-98fb-23de62292ee1" containerID="a6c22a60fcada9e92cb66879c38c91f372ae8f897c2e94bd5fd9d9eb79e7679b" exitCode=0 Feb 20 17:11:32 crc kubenswrapper[4697]: I0220 17:11:32.766627 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" event={"ID":"e6c6663b-45c5-4629-98fb-23de62292ee1","Type":"ContainerDied","Data":"a6c22a60fcada9e92cb66879c38c91f372ae8f897c2e94bd5fd9d9eb79e7679b"} Feb 20 17:11:34 crc 
kubenswrapper[4697]: I0220 17:11:34.200532 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.348592 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-secret-0\") pod \"e6c6663b-45c5-4629-98fb-23de62292ee1\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.348733 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-ssh-key-openstack-edpm-ipam\") pod \"e6c6663b-45c5-4629-98fb-23de62292ee1\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.348782 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-inventory\") pod \"e6c6663b-45c5-4629-98fb-23de62292ee1\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.348905 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5mrl\" (UniqueName: \"kubernetes.io/projected/e6c6663b-45c5-4629-98fb-23de62292ee1-kube-api-access-d5mrl\") pod \"e6c6663b-45c5-4629-98fb-23de62292ee1\" (UID: \"e6c6663b-45c5-4629-98fb-23de62292ee1\") " Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.348974 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-combined-ca-bundle\") pod \"e6c6663b-45c5-4629-98fb-23de62292ee1\" (UID: 
\"e6c6663b-45c5-4629-98fb-23de62292ee1\") " Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.353963 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e6c6663b-45c5-4629-98fb-23de62292ee1" (UID: "e6c6663b-45c5-4629-98fb-23de62292ee1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.355697 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c6663b-45c5-4629-98fb-23de62292ee1-kube-api-access-d5mrl" (OuterVolumeSpecName: "kube-api-access-d5mrl") pod "e6c6663b-45c5-4629-98fb-23de62292ee1" (UID: "e6c6663b-45c5-4629-98fb-23de62292ee1"). InnerVolumeSpecName "kube-api-access-d5mrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.377973 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e6c6663b-45c5-4629-98fb-23de62292ee1" (UID: "e6c6663b-45c5-4629-98fb-23de62292ee1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.378809 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-inventory" (OuterVolumeSpecName: "inventory") pod "e6c6663b-45c5-4629-98fb-23de62292ee1" (UID: "e6c6663b-45c5-4629-98fb-23de62292ee1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.381052 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e6c6663b-45c5-4629-98fb-23de62292ee1" (UID: "e6c6663b-45c5-4629-98fb-23de62292ee1"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.451342 4697 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.451384 4697 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.451397 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.451409 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6c6663b-45c5-4629-98fb-23de62292ee1-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.451422 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5mrl\" (UniqueName: \"kubernetes.io/projected/e6c6663b-45c5-4629-98fb-23de62292ee1-kube-api-access-d5mrl\") on node \"crc\" DevicePath \"\"" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.787280 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" event={"ID":"e6c6663b-45c5-4629-98fb-23de62292ee1","Type":"ContainerDied","Data":"e7b0c749b419fc6ea07e40abd179972a98a7477989a78454497ec1003340f5a2"} Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.787316 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7b0c749b419fc6ea07e40abd179972a98a7477989a78454497ec1003340f5a2" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.787363 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.893804 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf"] Feb 20 17:11:34 crc kubenswrapper[4697]: E0220 17:11:34.894270 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6215194-2667-48c3-8eea-25bc74ea3c70" containerName="registry-server" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.894296 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6215194-2667-48c3-8eea-25bc74ea3c70" containerName="registry-server" Feb 20 17:11:34 crc kubenswrapper[4697]: E0220 17:11:34.894324 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6215194-2667-48c3-8eea-25bc74ea3c70" containerName="extract-content" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.894336 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6215194-2667-48c3-8eea-25bc74ea3c70" containerName="extract-content" Feb 20 17:11:34 crc kubenswrapper[4697]: E0220 17:11:34.894354 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c6663b-45c5-4629-98fb-23de62292ee1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.894366 4697 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e6c6663b-45c5-4629-98fb-23de62292ee1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 20 17:11:34 crc kubenswrapper[4697]: E0220 17:11:34.894405 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6215194-2667-48c3-8eea-25bc74ea3c70" containerName="extract-utilities" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.894417 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6215194-2667-48c3-8eea-25bc74ea3c70" containerName="extract-utilities" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.894722 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c6663b-45c5-4629-98fb-23de62292ee1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.894767 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6215194-2667-48c3-8eea-25bc74ea3c70" containerName="registry-server" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.895633 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.897913 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.898212 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.898250 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.898455 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.898595 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.899225 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.899740 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 17:11:34 crc kubenswrapper[4697]: I0220 17:11:34.902891 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf"] Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.060707 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtdbh\" (UniqueName: \"kubernetes.io/projected/62242a65-ea27-495f-aa04-4a274f9e771a-kube-api-access-wtdbh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 
17:11:35.060794 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.060944 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.061009 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.061071 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.061094 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.061119 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.061143 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.061178 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.061251 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/62242a65-ea27-495f-aa04-4a274f9e771a-nova-extra-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.061547 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.163592 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtdbh\" (UniqueName: \"kubernetes.io/projected/62242a65-ea27-495f-aa04-4a274f9e771a-kube-api-access-wtdbh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.164082 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.164210 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc 
kubenswrapper[4697]: I0220 17:11:35.164333 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.164474 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.164553 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.164649 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.164739 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.164957 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.165396 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/62242a65-ea27-495f-aa04-4a274f9e771a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.165601 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.168006 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 
crc kubenswrapper[4697]: I0220 17:11:35.168050 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/62242a65-ea27-495f-aa04-4a274f9e771a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.169020 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.169399 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.169602 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.170063 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.170064 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.170778 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.172961 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.174874 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.184001 4697 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wtdbh\" (UniqueName: \"kubernetes.io/projected/62242a65-ea27-495f-aa04-4a274f9e771a-kube-api-access-wtdbh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wzxdf\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.225239 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.741481 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf"] Feb 20 17:11:35 crc kubenswrapper[4697]: I0220 17:11:35.801793 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" event={"ID":"62242a65-ea27-495f-aa04-4a274f9e771a","Type":"ContainerStarted","Data":"f4552b003a9d0919cb212eb5d043ca95550a1fe351b4ce8ee869efd379a36233"} Feb 20 17:11:36 crc kubenswrapper[4697]: I0220 17:11:36.817470 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" event={"ID":"62242a65-ea27-495f-aa04-4a274f9e771a","Type":"ContainerStarted","Data":"b99cd6a9b9913fb4a4929bcb99ce23af1fdad21369bb1d4fcb713b37bffbc83c"} Feb 20 17:11:36 crc kubenswrapper[4697]: I0220 17:11:36.851494 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" podStartSLOduration=2.443912761 podStartE2EDuration="2.851469203s" podCreationTimestamp="2026-02-20 17:11:34 +0000 UTC" firstStartedPulling="2026-02-20 17:11:35.741825558 +0000 UTC m=+2403.521870976" lastFinishedPulling="2026-02-20 17:11:36.14938201 +0000 UTC m=+2403.929427418" observedRunningTime="2026-02-20 17:11:36.849484694 +0000 UTC m=+2404.629530102" watchObservedRunningTime="2026-02-20 17:11:36.851469203 +0000 UTC 
m=+2404.631514651" Feb 20 17:11:41 crc kubenswrapper[4697]: I0220 17:11:41.877704 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:11:41 crc kubenswrapper[4697]: E0220 17:11:41.879522 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:11:55 crc kubenswrapper[4697]: I0220 17:11:55.876707 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:11:55 crc kubenswrapper[4697]: E0220 17:11:55.877449 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:12:09 crc kubenswrapper[4697]: I0220 17:12:09.877000 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:12:09 crc kubenswrapper[4697]: E0220 17:12:09.877938 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" 
podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:12:24 crc kubenswrapper[4697]: I0220 17:12:24.878862 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:12:24 crc kubenswrapper[4697]: E0220 17:12:24.879757 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:12:36 crc kubenswrapper[4697]: I0220 17:12:36.876876 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:12:36 crc kubenswrapper[4697]: E0220 17:12:36.877615 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:12:50 crc kubenswrapper[4697]: I0220 17:12:50.878765 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:12:50 crc kubenswrapper[4697]: E0220 17:12:50.880060 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:13:02 crc kubenswrapper[4697]: I0220 17:13:02.884006 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:13:03 crc kubenswrapper[4697]: I0220 17:13:03.710084 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"9d496f5c2a2fdecd518eeefb67465f7013c83166b86974cefe0c57b35aaead31"} Feb 20 17:13:04 crc kubenswrapper[4697]: I0220 17:13:04.904605 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w5pgx"] Feb 20 17:13:04 crc kubenswrapper[4697]: I0220 17:13:04.908143 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:04 crc kubenswrapper[4697]: I0220 17:13:04.919185 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5pgx"] Feb 20 17:13:04 crc kubenswrapper[4697]: I0220 17:13:04.990610 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-catalog-content\") pod \"certified-operators-w5pgx\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:04 crc kubenswrapper[4697]: I0220 17:13:04.990809 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-utilities\") pod \"certified-operators-w5pgx\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 
17:13:04 crc kubenswrapper[4697]: I0220 17:13:04.991046 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqwlj\" (UniqueName: \"kubernetes.io/projected/361edc64-43a5-479d-8c0b-de859fb1167f-kube-api-access-hqwlj\") pod \"certified-operators-w5pgx\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:05 crc kubenswrapper[4697]: I0220 17:13:05.094077 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-utilities\") pod \"certified-operators-w5pgx\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:05 crc kubenswrapper[4697]: I0220 17:13:05.094282 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqwlj\" (UniqueName: \"kubernetes.io/projected/361edc64-43a5-479d-8c0b-de859fb1167f-kube-api-access-hqwlj\") pod \"certified-operators-w5pgx\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:05 crc kubenswrapper[4697]: I0220 17:13:05.094481 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-catalog-content\") pod \"certified-operators-w5pgx\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:05 crc kubenswrapper[4697]: I0220 17:13:05.094775 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-utilities\") pod \"certified-operators-w5pgx\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 
17:13:05 crc kubenswrapper[4697]: I0220 17:13:05.094886 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-catalog-content\") pod \"certified-operators-w5pgx\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:05 crc kubenswrapper[4697]: I0220 17:13:05.114696 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqwlj\" (UniqueName: \"kubernetes.io/projected/361edc64-43a5-479d-8c0b-de859fb1167f-kube-api-access-hqwlj\") pod \"certified-operators-w5pgx\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:05 crc kubenswrapper[4697]: I0220 17:13:05.272684 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:05 crc kubenswrapper[4697]: I0220 17:13:05.846723 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5pgx"] Feb 20 17:13:05 crc kubenswrapper[4697]: W0220 17:13:05.858854 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod361edc64_43a5_479d_8c0b_de859fb1167f.slice/crio-96da26b73b4564f4426a20a2d2ed1d07480b4e62b936108df159be740dedb1ac WatchSource:0}: Error finding container 96da26b73b4564f4426a20a2d2ed1d07480b4e62b936108df159be740dedb1ac: Status 404 returned error can't find the container with id 96da26b73b4564f4426a20a2d2ed1d07480b4e62b936108df159be740dedb1ac Feb 20 17:13:06 crc kubenswrapper[4697]: I0220 17:13:06.739367 4697 generic.go:334] "Generic (PLEG): container finished" podID="361edc64-43a5-479d-8c0b-de859fb1167f" containerID="8024cdfbf71b3e15cc6fda3357fbd7a462a25d62ee07e4313438858153a96149" exitCode=0 Feb 20 17:13:06 crc kubenswrapper[4697]: I0220 
17:13:06.739473 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5pgx" event={"ID":"361edc64-43a5-479d-8c0b-de859fb1167f","Type":"ContainerDied","Data":"8024cdfbf71b3e15cc6fda3357fbd7a462a25d62ee07e4313438858153a96149"} Feb 20 17:13:06 crc kubenswrapper[4697]: I0220 17:13:06.739813 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5pgx" event={"ID":"361edc64-43a5-479d-8c0b-de859fb1167f","Type":"ContainerStarted","Data":"96da26b73b4564f4426a20a2d2ed1d07480b4e62b936108df159be740dedb1ac"} Feb 20 17:13:08 crc kubenswrapper[4697]: I0220 17:13:08.761443 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5pgx" event={"ID":"361edc64-43a5-479d-8c0b-de859fb1167f","Type":"ContainerStarted","Data":"e273caebf6e582dea605c05bb81c4e370aa0fc15da86e2e607a8923356e68361"} Feb 20 17:13:09 crc kubenswrapper[4697]: I0220 17:13:09.777922 4697 generic.go:334] "Generic (PLEG): container finished" podID="361edc64-43a5-479d-8c0b-de859fb1167f" containerID="e273caebf6e582dea605c05bb81c4e370aa0fc15da86e2e607a8923356e68361" exitCode=0 Feb 20 17:13:09 crc kubenswrapper[4697]: I0220 17:13:09.777988 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5pgx" event={"ID":"361edc64-43a5-479d-8c0b-de859fb1167f","Type":"ContainerDied","Data":"e273caebf6e582dea605c05bb81c4e370aa0fc15da86e2e607a8923356e68361"} Feb 20 17:13:10 crc kubenswrapper[4697]: I0220 17:13:10.790659 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5pgx" event={"ID":"361edc64-43a5-479d-8c0b-de859fb1167f","Type":"ContainerStarted","Data":"209206eb002669894a1e4ffb54804c09a049062a8b4a440028b1b27da4af4fa5"} Feb 20 17:13:10 crc kubenswrapper[4697]: I0220 17:13:10.817212 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-w5pgx" podStartSLOduration=3.352202681 podStartE2EDuration="6.817190481s" podCreationTimestamp="2026-02-20 17:13:04 +0000 UTC" firstStartedPulling="2026-02-20 17:13:06.742870709 +0000 UTC m=+2494.522916137" lastFinishedPulling="2026-02-20 17:13:10.207858529 +0000 UTC m=+2497.987903937" observedRunningTime="2026-02-20 17:13:10.812738289 +0000 UTC m=+2498.592783707" watchObservedRunningTime="2026-02-20 17:13:10.817190481 +0000 UTC m=+2498.597235889" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.273841 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.274893 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.335638 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.589866 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sfb9p"] Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.593048 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.605792 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sfb9p"] Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.705959 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26gw7\" (UniqueName: \"kubernetes.io/projected/5534a571-470d-4ec2-8ca7-91c28895aa0a-kube-api-access-26gw7\") pod \"redhat-operators-sfb9p\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.706015 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-catalog-content\") pod \"redhat-operators-sfb9p\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.706053 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-utilities\") pod \"redhat-operators-sfb9p\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.808960 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26gw7\" (UniqueName: \"kubernetes.io/projected/5534a571-470d-4ec2-8ca7-91c28895aa0a-kube-api-access-26gw7\") pod \"redhat-operators-sfb9p\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.809017 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-catalog-content\") pod \"redhat-operators-sfb9p\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.809060 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-utilities\") pod \"redhat-operators-sfb9p\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.809845 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-utilities\") pod \"redhat-operators-sfb9p\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.810173 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-catalog-content\") pod \"redhat-operators-sfb9p\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.837694 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26gw7\" (UniqueName: \"kubernetes.io/projected/5534a571-470d-4ec2-8ca7-91c28895aa0a-kube-api-access-26gw7\") pod \"redhat-operators-sfb9p\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.925965 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:15 crc kubenswrapper[4697]: I0220 17:13:15.961105 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:16 crc kubenswrapper[4697]: I0220 17:13:16.551645 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sfb9p"] Feb 20 17:13:16 crc kubenswrapper[4697]: I0220 17:13:16.867426 4697 generic.go:334] "Generic (PLEG): container finished" podID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerID="a4971eb2ec03f6c41ad65aba29e6ddd9e596e088b560016122a47e21cc605a07" exitCode=0 Feb 20 17:13:16 crc kubenswrapper[4697]: I0220 17:13:16.867550 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfb9p" event={"ID":"5534a571-470d-4ec2-8ca7-91c28895aa0a","Type":"ContainerDied","Data":"a4971eb2ec03f6c41ad65aba29e6ddd9e596e088b560016122a47e21cc605a07"} Feb 20 17:13:16 crc kubenswrapper[4697]: I0220 17:13:16.867801 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfb9p" event={"ID":"5534a571-470d-4ec2-8ca7-91c28895aa0a","Type":"ContainerStarted","Data":"b20edd88633ebd1c38883bbe7a9ba19e10031656feb7ae22545cd3dd18567354"} Feb 20 17:13:18 crc kubenswrapper[4697]: I0220 17:13:18.375696 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5pgx"] Feb 20 17:13:18 crc kubenswrapper[4697]: I0220 17:13:18.376789 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w5pgx" podUID="361edc64-43a5-479d-8c0b-de859fb1167f" containerName="registry-server" containerID="cri-o://209206eb002669894a1e4ffb54804c09a049062a8b4a440028b1b27da4af4fa5" gracePeriod=2 Feb 20 17:13:18 crc kubenswrapper[4697]: I0220 17:13:18.888956 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-sfb9p" event={"ID":"5534a571-470d-4ec2-8ca7-91c28895aa0a","Type":"ContainerStarted","Data":"49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb"} Feb 20 17:13:18 crc kubenswrapper[4697]: I0220 17:13:18.892635 4697 generic.go:334] "Generic (PLEG): container finished" podID="361edc64-43a5-479d-8c0b-de859fb1167f" containerID="209206eb002669894a1e4ffb54804c09a049062a8b4a440028b1b27da4af4fa5" exitCode=0 Feb 20 17:13:18 crc kubenswrapper[4697]: I0220 17:13:18.892673 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5pgx" event={"ID":"361edc64-43a5-479d-8c0b-de859fb1167f","Type":"ContainerDied","Data":"209206eb002669894a1e4ffb54804c09a049062a8b4a440028b1b27da4af4fa5"} Feb 20 17:13:18 crc kubenswrapper[4697]: I0220 17:13:18.892692 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5pgx" event={"ID":"361edc64-43a5-479d-8c0b-de859fb1167f","Type":"ContainerDied","Data":"96da26b73b4564f4426a20a2d2ed1d07480b4e62b936108df159be740dedb1ac"} Feb 20 17:13:18 crc kubenswrapper[4697]: I0220 17:13:18.892704 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96da26b73b4564f4426a20a2d2ed1d07480b4e62b936108df159be740dedb1ac" Feb 20 17:13:18 crc kubenswrapper[4697]: I0220 17:13:18.902464 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.074551 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-utilities\") pod \"361edc64-43a5-479d-8c0b-de859fb1167f\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.074628 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqwlj\" (UniqueName: \"kubernetes.io/projected/361edc64-43a5-479d-8c0b-de859fb1167f-kube-api-access-hqwlj\") pod \"361edc64-43a5-479d-8c0b-de859fb1167f\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.074777 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-catalog-content\") pod \"361edc64-43a5-479d-8c0b-de859fb1167f\" (UID: \"361edc64-43a5-479d-8c0b-de859fb1167f\") " Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.075305 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-utilities" (OuterVolumeSpecName: "utilities") pod "361edc64-43a5-479d-8c0b-de859fb1167f" (UID: "361edc64-43a5-479d-8c0b-de859fb1167f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.082777 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361edc64-43a5-479d-8c0b-de859fb1167f-kube-api-access-hqwlj" (OuterVolumeSpecName: "kube-api-access-hqwlj") pod "361edc64-43a5-479d-8c0b-de859fb1167f" (UID: "361edc64-43a5-479d-8c0b-de859fb1167f"). InnerVolumeSpecName "kube-api-access-hqwlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.134130 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "361edc64-43a5-479d-8c0b-de859fb1167f" (UID: "361edc64-43a5-479d-8c0b-de859fb1167f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.177157 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.177192 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqwlj\" (UniqueName: \"kubernetes.io/projected/361edc64-43a5-479d-8c0b-de859fb1167f-kube-api-access-hqwlj\") on node \"crc\" DevicePath \"\"" Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.177206 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/361edc64-43a5-479d-8c0b-de859fb1167f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.902995 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5pgx" Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.938871 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5pgx"] Feb 20 17:13:19 crc kubenswrapper[4697]: I0220 17:13:19.945315 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w5pgx"] Feb 20 17:13:20 crc kubenswrapper[4697]: I0220 17:13:20.886397 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="361edc64-43a5-479d-8c0b-de859fb1167f" path="/var/lib/kubelet/pods/361edc64-43a5-479d-8c0b-de859fb1167f/volumes" Feb 20 17:13:21 crc kubenswrapper[4697]: I0220 17:13:21.922462 4697 generic.go:334] "Generic (PLEG): container finished" podID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerID="49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb" exitCode=0 Feb 20 17:13:21 crc kubenswrapper[4697]: I0220 17:13:21.922474 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfb9p" event={"ID":"5534a571-470d-4ec2-8ca7-91c28895aa0a","Type":"ContainerDied","Data":"49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb"} Feb 20 17:13:22 crc kubenswrapper[4697]: I0220 17:13:22.933189 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfb9p" event={"ID":"5534a571-470d-4ec2-8ca7-91c28895aa0a","Type":"ContainerStarted","Data":"30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83"} Feb 20 17:13:22 crc kubenswrapper[4697]: I0220 17:13:22.951203 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sfb9p" podStartSLOduration=2.486694531 podStartE2EDuration="7.951187808s" podCreationTimestamp="2026-02-20 17:13:15 +0000 UTC" firstStartedPulling="2026-02-20 17:13:16.869109473 +0000 UTC m=+2504.649154881" lastFinishedPulling="2026-02-20 
17:13:22.33360276 +0000 UTC m=+2510.113648158" observedRunningTime="2026-02-20 17:13:22.948785998 +0000 UTC m=+2510.728831406" watchObservedRunningTime="2026-02-20 17:13:22.951187808 +0000 UTC m=+2510.731233216" Feb 20 17:13:25 crc kubenswrapper[4697]: I0220 17:13:25.927294 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:25 crc kubenswrapper[4697]: I0220 17:13:25.928990 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:26 crc kubenswrapper[4697]: I0220 17:13:26.998717 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sfb9p" podUID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerName="registry-server" probeResult="failure" output=< Feb 20 17:13:26 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 17:13:26 crc kubenswrapper[4697]: > Feb 20 17:13:35 crc kubenswrapper[4697]: I0220 17:13:35.981764 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:36 crc kubenswrapper[4697]: I0220 17:13:36.038194 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:36 crc kubenswrapper[4697]: I0220 17:13:36.220426 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sfb9p"] Feb 20 17:13:37 crc kubenswrapper[4697]: I0220 17:13:37.066612 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sfb9p" podUID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerName="registry-server" containerID="cri-o://30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83" gracePeriod=2 Feb 20 17:13:37 crc kubenswrapper[4697]: I0220 17:13:37.514021 4697 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:37 crc kubenswrapper[4697]: I0220 17:13:37.646513 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-catalog-content\") pod \"5534a571-470d-4ec2-8ca7-91c28895aa0a\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " Feb 20 17:13:37 crc kubenswrapper[4697]: I0220 17:13:37.646693 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26gw7\" (UniqueName: \"kubernetes.io/projected/5534a571-470d-4ec2-8ca7-91c28895aa0a-kube-api-access-26gw7\") pod \"5534a571-470d-4ec2-8ca7-91c28895aa0a\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " Feb 20 17:13:37 crc kubenswrapper[4697]: I0220 17:13:37.646771 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-utilities\") pod \"5534a571-470d-4ec2-8ca7-91c28895aa0a\" (UID: \"5534a571-470d-4ec2-8ca7-91c28895aa0a\") " Feb 20 17:13:37 crc kubenswrapper[4697]: I0220 17:13:37.648018 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-utilities" (OuterVolumeSpecName: "utilities") pod "5534a571-470d-4ec2-8ca7-91c28895aa0a" (UID: "5534a571-470d-4ec2-8ca7-91c28895aa0a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:13:37 crc kubenswrapper[4697]: I0220 17:13:37.660232 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5534a571-470d-4ec2-8ca7-91c28895aa0a-kube-api-access-26gw7" (OuterVolumeSpecName: "kube-api-access-26gw7") pod "5534a571-470d-4ec2-8ca7-91c28895aa0a" (UID: "5534a571-470d-4ec2-8ca7-91c28895aa0a"). 
InnerVolumeSpecName "kube-api-access-26gw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:13:37 crc kubenswrapper[4697]: I0220 17:13:37.748818 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26gw7\" (UniqueName: \"kubernetes.io/projected/5534a571-470d-4ec2-8ca7-91c28895aa0a-kube-api-access-26gw7\") on node \"crc\" DevicePath \"\"" Feb 20 17:13:37 crc kubenswrapper[4697]: I0220 17:13:37.749183 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:13:37 crc kubenswrapper[4697]: I0220 17:13:37.772388 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5534a571-470d-4ec2-8ca7-91c28895aa0a" (UID: "5534a571-470d-4ec2-8ca7-91c28895aa0a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:13:37 crc kubenswrapper[4697]: I0220 17:13:37.851561 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534a571-470d-4ec2-8ca7-91c28895aa0a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.078230 4697 generic.go:334] "Generic (PLEG): container finished" podID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerID="30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83" exitCode=0 Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.078284 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sfb9p" Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.078303 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfb9p" event={"ID":"5534a571-470d-4ec2-8ca7-91c28895aa0a","Type":"ContainerDied","Data":"30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83"} Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.079482 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfb9p" event={"ID":"5534a571-470d-4ec2-8ca7-91c28895aa0a","Type":"ContainerDied","Data":"b20edd88633ebd1c38883bbe7a9ba19e10031656feb7ae22545cd3dd18567354"} Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.079501 4697 scope.go:117] "RemoveContainer" containerID="30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83" Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.105761 4697 scope.go:117] "RemoveContainer" containerID="49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb" Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.129493 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sfb9p"] Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.130397 4697 scope.go:117] "RemoveContainer" containerID="a4971eb2ec03f6c41ad65aba29e6ddd9e596e088b560016122a47e21cc605a07" Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.142635 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sfb9p"] Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.208617 4697 scope.go:117] "RemoveContainer" containerID="30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83" Feb 20 17:13:38 crc kubenswrapper[4697]: E0220 17:13:38.209115 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83\": container with ID starting with 30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83 not found: ID does not exist" containerID="30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83" Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.209149 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83"} err="failed to get container status \"30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83\": rpc error: code = NotFound desc = could not find container \"30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83\": container with ID starting with 30a60a1db4ebda29aa17e689115c143a1dffaec9acdfe9c6104854bee1f72f83 not found: ID does not exist" Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.209175 4697 scope.go:117] "RemoveContainer" containerID="49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb" Feb 20 17:13:38 crc kubenswrapper[4697]: E0220 17:13:38.209565 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb\": container with ID starting with 49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb not found: ID does not exist" containerID="49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb" Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.209592 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb"} err="failed to get container status \"49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb\": rpc error: code = NotFound desc = could not find container \"49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb\": container with ID 
starting with 49eac4839e7a93c49c77dcc93e0216cce7e8063d4fab62a311983e1e196a94eb not found: ID does not exist" Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.209608 4697 scope.go:117] "RemoveContainer" containerID="a4971eb2ec03f6c41ad65aba29e6ddd9e596e088b560016122a47e21cc605a07" Feb 20 17:13:38 crc kubenswrapper[4697]: E0220 17:13:38.209936 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4971eb2ec03f6c41ad65aba29e6ddd9e596e088b560016122a47e21cc605a07\": container with ID starting with a4971eb2ec03f6c41ad65aba29e6ddd9e596e088b560016122a47e21cc605a07 not found: ID does not exist" containerID="a4971eb2ec03f6c41ad65aba29e6ddd9e596e088b560016122a47e21cc605a07" Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.209959 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4971eb2ec03f6c41ad65aba29e6ddd9e596e088b560016122a47e21cc605a07"} err="failed to get container status \"a4971eb2ec03f6c41ad65aba29e6ddd9e596e088b560016122a47e21cc605a07\": rpc error: code = NotFound desc = could not find container \"a4971eb2ec03f6c41ad65aba29e6ddd9e596e088b560016122a47e21cc605a07\": container with ID starting with a4971eb2ec03f6c41ad65aba29e6ddd9e596e088b560016122a47e21cc605a07 not found: ID does not exist" Feb 20 17:13:38 crc kubenswrapper[4697]: I0220 17:13:38.888610 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5534a571-470d-4ec2-8ca7-91c28895aa0a" path="/var/lib/kubelet/pods/5534a571-470d-4ec2-8ca7-91c28895aa0a/volumes" Feb 20 17:14:13 crc kubenswrapper[4697]: I0220 17:14:13.448762 4697 generic.go:334] "Generic (PLEG): container finished" podID="62242a65-ea27-495f-aa04-4a274f9e771a" containerID="b99cd6a9b9913fb4a4929bcb99ce23af1fdad21369bb1d4fcb713b37bffbc83c" exitCode=0 Feb 20 17:14:13 crc kubenswrapper[4697]: I0220 17:14:13.448845 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" event={"ID":"62242a65-ea27-495f-aa04-4a274f9e771a","Type":"ContainerDied","Data":"b99cd6a9b9913fb4a4929bcb99ce23af1fdad21369bb1d4fcb713b37bffbc83c"} Feb 20 17:14:14 crc kubenswrapper[4697]: I0220 17:14:14.911555 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.076726 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-1\") pod \"62242a65-ea27-495f-aa04-4a274f9e771a\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.076837 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-1\") pod \"62242a65-ea27-495f-aa04-4a274f9e771a\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.076878 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-0\") pod \"62242a65-ea27-495f-aa04-4a274f9e771a\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.076911 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtdbh\" (UniqueName: \"kubernetes.io/projected/62242a65-ea27-495f-aa04-4a274f9e771a-kube-api-access-wtdbh\") pod \"62242a65-ea27-495f-aa04-4a274f9e771a\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.076942 4697 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-inventory\") pod \"62242a65-ea27-495f-aa04-4a274f9e771a\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.077047 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-combined-ca-bundle\") pod \"62242a65-ea27-495f-aa04-4a274f9e771a\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.077069 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-3\") pod \"62242a65-ea27-495f-aa04-4a274f9e771a\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.077100 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-2\") pod \"62242a65-ea27-495f-aa04-4a274f9e771a\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.077135 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/62242a65-ea27-495f-aa04-4a274f9e771a-nova-extra-config-0\") pod \"62242a65-ea27-495f-aa04-4a274f9e771a\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.077214 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-0\") pod \"62242a65-ea27-495f-aa04-4a274f9e771a\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.077267 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-ssh-key-openstack-edpm-ipam\") pod \"62242a65-ea27-495f-aa04-4a274f9e771a\" (UID: \"62242a65-ea27-495f-aa04-4a274f9e771a\") " Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.082642 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62242a65-ea27-495f-aa04-4a274f9e771a-kube-api-access-wtdbh" (OuterVolumeSpecName: "kube-api-access-wtdbh") pod "62242a65-ea27-495f-aa04-4a274f9e771a" (UID: "62242a65-ea27-495f-aa04-4a274f9e771a"). InnerVolumeSpecName "kube-api-access-wtdbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.097858 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "62242a65-ea27-495f-aa04-4a274f9e771a" (UID: "62242a65-ea27-495f-aa04-4a274f9e771a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.107805 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "62242a65-ea27-495f-aa04-4a274f9e771a" (UID: "62242a65-ea27-495f-aa04-4a274f9e771a"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.109167 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "62242a65-ea27-495f-aa04-4a274f9e771a" (UID: "62242a65-ea27-495f-aa04-4a274f9e771a"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.112924 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "62242a65-ea27-495f-aa04-4a274f9e771a" (UID: "62242a65-ea27-495f-aa04-4a274f9e771a"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.117104 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-inventory" (OuterVolumeSpecName: "inventory") pod "62242a65-ea27-495f-aa04-4a274f9e771a" (UID: "62242a65-ea27-495f-aa04-4a274f9e771a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.119716 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "62242a65-ea27-495f-aa04-4a274f9e771a" (UID: "62242a65-ea27-495f-aa04-4a274f9e771a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.122478 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "62242a65-ea27-495f-aa04-4a274f9e771a" (UID: "62242a65-ea27-495f-aa04-4a274f9e771a"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.123277 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "62242a65-ea27-495f-aa04-4a274f9e771a" (UID: "62242a65-ea27-495f-aa04-4a274f9e771a"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.125004 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62242a65-ea27-495f-aa04-4a274f9e771a-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "62242a65-ea27-495f-aa04-4a274f9e771a" (UID: "62242a65-ea27-495f-aa04-4a274f9e771a"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.127654 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "62242a65-ea27-495f-aa04-4a274f9e771a" (UID: "62242a65-ea27-495f-aa04-4a274f9e771a"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.179876 4697 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.179910 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.179919 4697 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.179929 4697 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.179938 4697 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.179947 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtdbh\" (UniqueName: \"kubernetes.io/projected/62242a65-ea27-495f-aa04-4a274f9e771a-kube-api-access-wtdbh\") on node \"crc\" DevicePath \"\"" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.179956 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-inventory\") 
on node \"crc\" DevicePath \"\"" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.179965 4697 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.179975 4697 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.179983 4697 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/62242a65-ea27-495f-aa04-4a274f9e771a-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.179992 4697 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/62242a65-ea27-495f-aa04-4a274f9e771a-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.469936 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" event={"ID":"62242a65-ea27-495f-aa04-4a274f9e771a","Type":"ContainerDied","Data":"f4552b003a9d0919cb212eb5d043ca95550a1fe351b4ce8ee869efd379a36233"} Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.470002 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wzxdf" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.469986 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4552b003a9d0919cb212eb5d043ca95550a1fe351b4ce8ee869efd379a36233" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.630751 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56"] Feb 20 17:14:15 crc kubenswrapper[4697]: E0220 17:14:15.631208 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361edc64-43a5-479d-8c0b-de859fb1167f" containerName="extract-utilities" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.631230 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="361edc64-43a5-479d-8c0b-de859fb1167f" containerName="extract-utilities" Feb 20 17:14:15 crc kubenswrapper[4697]: E0220 17:14:15.631250 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerName="registry-server" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.631260 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerName="registry-server" Feb 20 17:14:15 crc kubenswrapper[4697]: E0220 17:14:15.631277 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62242a65-ea27-495f-aa04-4a274f9e771a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.631286 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="62242a65-ea27-495f-aa04-4a274f9e771a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 20 17:14:15 crc kubenswrapper[4697]: E0220 17:14:15.631301 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361edc64-43a5-479d-8c0b-de859fb1167f" containerName="extract-content" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 
17:14:15.631308 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="361edc64-43a5-479d-8c0b-de859fb1167f" containerName="extract-content" Feb 20 17:14:15 crc kubenswrapper[4697]: E0220 17:14:15.631326 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerName="extract-utilities" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.631334 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerName="extract-utilities" Feb 20 17:14:15 crc kubenswrapper[4697]: E0220 17:14:15.631353 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerName="extract-content" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.631360 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerName="extract-content" Feb 20 17:14:15 crc kubenswrapper[4697]: E0220 17:14:15.631375 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361edc64-43a5-479d-8c0b-de859fb1167f" containerName="registry-server" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.631384 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="361edc64-43a5-479d-8c0b-de859fb1167f" containerName="registry-server" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.631636 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="62242a65-ea27-495f-aa04-4a274f9e771a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.631661 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="5534a571-470d-4ec2-8ca7-91c28895aa0a" containerName="registry-server" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.631682 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="361edc64-43a5-479d-8c0b-de859fb1167f" containerName="registry-server" Feb 20 17:14:15 crc 
kubenswrapper[4697]: I0220 17:14:15.632405 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.636359 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.636783 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.636881 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.636956 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.637163 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9psxc" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.647901 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56"] Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.795198 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.795256 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.795275 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.795395 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.795680 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm4xw\" (UniqueName: \"kubernetes.io/projected/db0ef420-9810-4b2f-8f10-b3fb710293c6-kube-api-access-wm4xw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.796063 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.796204 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.897392 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.897479 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.897519 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.897546 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.897562 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.897580 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.897631 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm4xw\" (UniqueName: \"kubernetes.io/projected/db0ef420-9810-4b2f-8f10-b3fb710293c6-kube-api-access-wm4xw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.901812 4697 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.902941 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.903939 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.904062 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.904213 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.904422 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.917160 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm4xw\" (UniqueName: \"kubernetes.io/projected/db0ef420-9810-4b2f-8f10-b3fb710293c6-kube-api-access-wm4xw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9st56\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:15 crc kubenswrapper[4697]: I0220 17:14:15.958169 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:14:16 crc kubenswrapper[4697]: I0220 17:14:16.545374 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56"] Feb 20 17:14:17 crc kubenswrapper[4697]: I0220 17:14:17.484993 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" event={"ID":"db0ef420-9810-4b2f-8f10-b3fb710293c6","Type":"ContainerStarted","Data":"dfb442a214f9e93a884ee102dee2ee226eb95eb2c09d6500b21f2570e9a4e799"} Feb 20 17:14:17 crc kubenswrapper[4697]: I0220 17:14:17.485391 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" event={"ID":"db0ef420-9810-4b2f-8f10-b3fb710293c6","Type":"ContainerStarted","Data":"e2eeecbce3640ac33796d792beb32abc4c8e5b934a6a2496da7b40a223688695"} Feb 20 17:14:17 crc kubenswrapper[4697]: I0220 17:14:17.504402 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" podStartSLOduration=2.094932812 podStartE2EDuration="2.50438346s" podCreationTimestamp="2026-02-20 17:14:15 +0000 UTC" firstStartedPulling="2026-02-20 17:14:16.5519214 +0000 UTC m=+2564.331966808" lastFinishedPulling="2026-02-20 17:14:16.961372038 +0000 UTC m=+2564.741417456" observedRunningTime="2026-02-20 17:14:17.499854037 +0000 UTC m=+2565.279899445" watchObservedRunningTime="2026-02-20 17:14:17.50438346 +0000 UTC m=+2565.284428878" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.167690 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv"] Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.169958 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.172950 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.174949 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.187476 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv"] Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.292406 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-config-volume\") pod \"collect-profiles-29526795-ttvtv\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.292533 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-secret-volume\") pod \"collect-profiles-29526795-ttvtv\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.292613 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcfj6\" (UniqueName: \"kubernetes.io/projected/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-kube-api-access-xcfj6\") pod \"collect-profiles-29526795-ttvtv\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.394183 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-secret-volume\") pod \"collect-profiles-29526795-ttvtv\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.394227 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcfj6\" (UniqueName: \"kubernetes.io/projected/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-kube-api-access-xcfj6\") pod \"collect-profiles-29526795-ttvtv\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.394359 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-config-volume\") pod \"collect-profiles-29526795-ttvtv\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.395289 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-config-volume\") pod \"collect-profiles-29526795-ttvtv\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.412252 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-secret-volume\") pod \"collect-profiles-29526795-ttvtv\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.414731 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcfj6\" (UniqueName: \"kubernetes.io/projected/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-kube-api-access-xcfj6\") pod \"collect-profiles-29526795-ttvtv\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.499317 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:00 crc kubenswrapper[4697]: I0220 17:15:00.951704 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv"] Feb 20 17:15:01 crc kubenswrapper[4697]: I0220 17:15:01.920772 4697 generic.go:334] "Generic (PLEG): container finished" podID="e4f83b20-b4f6-4b44-87f5-12ace92d12fb" containerID="f5e3011ef626e46ac458ed312e9ffb5eefeffe426ec276f3f86f13e9eb0a3a72" exitCode=0 Feb 20 17:15:01 crc kubenswrapper[4697]: I0220 17:15:01.920876 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" event={"ID":"e4f83b20-b4f6-4b44-87f5-12ace92d12fb","Type":"ContainerDied","Data":"f5e3011ef626e46ac458ed312e9ffb5eefeffe426ec276f3f86f13e9eb0a3a72"} Feb 20 17:15:01 crc kubenswrapper[4697]: I0220 17:15:01.921137 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" 
event={"ID":"e4f83b20-b4f6-4b44-87f5-12ace92d12fb","Type":"ContainerStarted","Data":"208e4515725c32fe115c7e78d9266df18d929e5e6d2d4089ea9538e7575b1393"} Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.391563 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.558373 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-secret-volume\") pod \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.558583 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcfj6\" (UniqueName: \"kubernetes.io/projected/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-kube-api-access-xcfj6\") pod \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.558660 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-config-volume\") pod \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\" (UID: \"e4f83b20-b4f6-4b44-87f5-12ace92d12fb\") " Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.559530 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-config-volume" (OuterVolumeSpecName: "config-volume") pod "e4f83b20-b4f6-4b44-87f5-12ace92d12fb" (UID: "e4f83b20-b4f6-4b44-87f5-12ace92d12fb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.563984 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-kube-api-access-xcfj6" (OuterVolumeSpecName: "kube-api-access-xcfj6") pod "e4f83b20-b4f6-4b44-87f5-12ace92d12fb" (UID: "e4f83b20-b4f6-4b44-87f5-12ace92d12fb"). InnerVolumeSpecName "kube-api-access-xcfj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.566494 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e4f83b20-b4f6-4b44-87f5-12ace92d12fb" (UID: "e4f83b20-b4f6-4b44-87f5-12ace92d12fb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.661138 4697 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.661176 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcfj6\" (UniqueName: \"kubernetes.io/projected/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-kube-api-access-xcfj6\") on node \"crc\" DevicePath \"\"" Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.661186 4697 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4f83b20-b4f6-4b44-87f5-12ace92d12fb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.940687 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" 
event={"ID":"e4f83b20-b4f6-4b44-87f5-12ace92d12fb","Type":"ContainerDied","Data":"208e4515725c32fe115c7e78d9266df18d929e5e6d2d4089ea9538e7575b1393"} Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.940724 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="208e4515725c32fe115c7e78d9266df18d929e5e6d2d4089ea9538e7575b1393" Feb 20 17:15:03 crc kubenswrapper[4697]: I0220 17:15:03.940969 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv" Feb 20 17:15:04 crc kubenswrapper[4697]: I0220 17:15:04.466306 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8"] Feb 20 17:15:04 crc kubenswrapper[4697]: I0220 17:15:04.474341 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526750-77rx8"] Feb 20 17:15:04 crc kubenswrapper[4697]: I0220 17:15:04.889299 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95906860-29b3-49a2-b903-37b9a5a808a5" path="/var/lib/kubelet/pods/95906860-29b3-49a2-b903-37b9a5a808a5/volumes" Feb 20 17:15:31 crc kubenswrapper[4697]: I0220 17:15:31.184743 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:15:31 crc kubenswrapper[4697]: I0220 17:15:31.185333 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:15:46 crc 
kubenswrapper[4697]: I0220 17:15:46.151493 4697 scope.go:117] "RemoveContainer" containerID="4464255c5a1c57a8e2a67aaf79c6b738fc5cacdb33c0ae9c5950c814b4acf82a" Feb 20 17:16:01 crc kubenswrapper[4697]: I0220 17:16:01.184338 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:16:01 crc kubenswrapper[4697]: I0220 17:16:01.184935 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:16:20 crc kubenswrapper[4697]: I0220 17:16:20.704151 4697 generic.go:334] "Generic (PLEG): container finished" podID="db0ef420-9810-4b2f-8f10-b3fb710293c6" containerID="dfb442a214f9e93a884ee102dee2ee226eb95eb2c09d6500b21f2570e9a4e799" exitCode=0 Feb 20 17:16:20 crc kubenswrapper[4697]: I0220 17:16:20.704178 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" event={"ID":"db0ef420-9810-4b2f-8f10-b3fb710293c6","Type":"ContainerDied","Data":"dfb442a214f9e93a884ee102dee2ee226eb95eb2c09d6500b21f2570e9a4e799"} Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.106882 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.229832 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-2\") pod \"db0ef420-9810-4b2f-8f10-b3fb710293c6\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.230190 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-inventory\") pod \"db0ef420-9810-4b2f-8f10-b3fb710293c6\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.230246 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ssh-key-openstack-edpm-ipam\") pod \"db0ef420-9810-4b2f-8f10-b3fb710293c6\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.230291 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-1\") pod \"db0ef420-9810-4b2f-8f10-b3fb710293c6\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.230366 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-telemetry-combined-ca-bundle\") pod \"db0ef420-9810-4b2f-8f10-b3fb710293c6\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " Feb 20 17:16:22 crc 
kubenswrapper[4697]: I0220 17:16:22.230409 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-0\") pod \"db0ef420-9810-4b2f-8f10-b3fb710293c6\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.230472 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm4xw\" (UniqueName: \"kubernetes.io/projected/db0ef420-9810-4b2f-8f10-b3fb710293c6-kube-api-access-wm4xw\") pod \"db0ef420-9810-4b2f-8f10-b3fb710293c6\" (UID: \"db0ef420-9810-4b2f-8f10-b3fb710293c6\") " Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.236210 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "db0ef420-9810-4b2f-8f10-b3fb710293c6" (UID: "db0ef420-9810-4b2f-8f10-b3fb710293c6"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.247616 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db0ef420-9810-4b2f-8f10-b3fb710293c6-kube-api-access-wm4xw" (OuterVolumeSpecName: "kube-api-access-wm4xw") pod "db0ef420-9810-4b2f-8f10-b3fb710293c6" (UID: "db0ef420-9810-4b2f-8f10-b3fb710293c6"). InnerVolumeSpecName "kube-api-access-wm4xw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.257607 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "db0ef420-9810-4b2f-8f10-b3fb710293c6" (UID: "db0ef420-9810-4b2f-8f10-b3fb710293c6"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.264354 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "db0ef420-9810-4b2f-8f10-b3fb710293c6" (UID: "db0ef420-9810-4b2f-8f10-b3fb710293c6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.265091 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-inventory" (OuterVolumeSpecName: "inventory") pod "db0ef420-9810-4b2f-8f10-b3fb710293c6" (UID: "db0ef420-9810-4b2f-8f10-b3fb710293c6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.265124 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "db0ef420-9810-4b2f-8f10-b3fb710293c6" (UID: "db0ef420-9810-4b2f-8f10-b3fb710293c6"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.265510 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "db0ef420-9810-4b2f-8f10-b3fb710293c6" (UID: "db0ef420-9810-4b2f-8f10-b3fb710293c6"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.333119 4697 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.333154 4697 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.333165 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.333176 4697 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.333186 4697 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:16:22 
crc kubenswrapper[4697]: I0220 17:16:22.333195 4697 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/db0ef420-9810-4b2f-8f10-b3fb710293c6-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.333203 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm4xw\" (UniqueName: \"kubernetes.io/projected/db0ef420-9810-4b2f-8f10-b3fb710293c6-kube-api-access-wm4xw\") on node \"crc\" DevicePath \"\"" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.724448 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" event={"ID":"db0ef420-9810-4b2f-8f10-b3fb710293c6","Type":"ContainerDied","Data":"e2eeecbce3640ac33796d792beb32abc4c8e5b934a6a2496da7b40a223688695"} Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.724493 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2eeecbce3640ac33796d792beb32abc4c8e5b934a6a2496da7b40a223688695" Feb 20 17:16:22 crc kubenswrapper[4697]: I0220 17:16:22.724521 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9st56" Feb 20 17:16:31 crc kubenswrapper[4697]: I0220 17:16:31.184902 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:16:31 crc kubenswrapper[4697]: I0220 17:16:31.185759 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:16:31 crc kubenswrapper[4697]: I0220 17:16:31.185834 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 17:16:31 crc kubenswrapper[4697]: I0220 17:16:31.187124 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d496f5c2a2fdecd518eeefb67465f7013c83166b86974cefe0c57b35aaead31"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 17:16:31 crc kubenswrapper[4697]: I0220 17:16:31.187294 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://9d496f5c2a2fdecd518eeefb67465f7013c83166b86974cefe0c57b35aaead31" gracePeriod=600 Feb 20 17:16:31 crc kubenswrapper[4697]: I0220 17:16:31.814561 4697 generic.go:334] "Generic (PLEG): container finished" 
podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="9d496f5c2a2fdecd518eeefb67465f7013c83166b86974cefe0c57b35aaead31" exitCode=0 Feb 20 17:16:31 crc kubenswrapper[4697]: I0220 17:16:31.814643 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"9d496f5c2a2fdecd518eeefb67465f7013c83166b86974cefe0c57b35aaead31"} Feb 20 17:16:31 crc kubenswrapper[4697]: I0220 17:16:31.814979 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87"} Feb 20 17:16:31 crc kubenswrapper[4697]: I0220 17:16:31.814996 4697 scope.go:117] "RemoveContainer" containerID="ffd74d59a61840d3b70a0e93393e26619d95650fedd6f121bdc6370c6bd381d3" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.812284 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 20 17:16:55 crc kubenswrapper[4697]: E0220 17:16:55.813163 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db0ef420-9810-4b2f-8f10-b3fb710293c6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.813177 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="db0ef420-9810-4b2f-8f10-b3fb710293c6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 20 17:16:55 crc kubenswrapper[4697]: E0220 17:16:55.813200 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4f83b20-b4f6-4b44-87f5-12ace92d12fb" containerName="collect-profiles" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.813205 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4f83b20-b4f6-4b44-87f5-12ace92d12fb" containerName="collect-profiles" Feb 20 
17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.813404 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="db0ef420-9810-4b2f-8f10-b3fb710293c6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.813423 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4f83b20-b4f6-4b44-87f5-12ace92d12fb" containerName="collect-profiles" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.814417 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.819754 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.827326 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.882397 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.884244 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.886069 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.895445 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949634 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949687 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-sys\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949705 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-scripts\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949743 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949769 4697 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-lib-modules\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949837 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr4jq\" (UniqueName: \"kubernetes.io/projected/0ca8173d-4029-4a57-80c9-c63c05842bb5-kube-api-access-zr4jq\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949875 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949905 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949936 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949960 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" 
(UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-dev\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949978 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.949999 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950035 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-sys\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950056 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950078 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4ffj\" (UniqueName: 
\"kubernetes.io/projected/e9192fe2-573e-4d32-915c-535887423540-kube-api-access-s4ffj\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950115 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950143 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950160 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950179 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950227 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-config-data\") pod \"cinder-volume-nfs-0\" (UID: 
\"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950246 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-config-data\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950263 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-dev\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950306 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950333 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950352 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-run\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 
17:16:55.950370 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950393 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-run\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950427 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950483 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.950506 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:55 crc kubenswrapper[4697]: I0220 17:16:55.998644 4697 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-volume-nfs-2-0"] Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.000727 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.002364 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.016255 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052115 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052172 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-scripts\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052198 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-sys\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052237 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 
17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052265 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052284 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-lib-modules\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052313 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052317 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-sys\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052339 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr4jq\" (UniqueName: \"kubernetes.io/projected/0ca8173d-4029-4a57-80c9-c63c05842bb5-kube-api-access-zr4jq\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052392 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-lib-modules\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052465 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052488 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052542 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052576 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052635 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " 
pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052661 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-dev\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052677 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052679 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052701 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052721 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052735 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-dev\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052752 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052792 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbrmf\" (UniqueName: \"kubernetes.io/projected/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-kube-api-access-gbrmf\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052824 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-sys\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052850 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052863 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-etc-machine-id\") pod \"cinder-backup-0\" (UID: 
\"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052904 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-sys\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052875 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4ffj\" (UniqueName: \"kubernetes.io/projected/e9192fe2-573e-4d32-915c-535887423540-kube-api-access-s4ffj\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.052966 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053005 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053023 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 
17:16:56.053062 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053086 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053103 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053120 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053176 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053196 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053212 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-config-data\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053256 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053265 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-dev\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053290 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053290 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc 
kubenswrapper[4697]: I0220 17:16:56.053309 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053314 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053340 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053366 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-dev\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053539 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053548 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-var-locks-brick\") 
pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053583 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053590 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053621 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053638 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-run\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053659 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 
17:16:56.053684 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-run\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053716 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053723 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-run\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053778 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053803 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-run\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053733 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-config-data-custom\") pod 
\"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053857 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053885 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053916 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.053956 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.054058 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca8173d-4029-4a57-80c9-c63c05842bb5-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 
17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.054135 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e9192fe2-573e-4d32-915c-535887423540-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.058452 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.060228 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-scripts\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.061058 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.062997 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.070186 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.070806 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ca8173d-4029-4a57-80c9-c63c05842bb5-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.071075 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.072728 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9192fe2-573e-4d32-915c-535887423540-config-data\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.081203 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4ffj\" (UniqueName: \"kubernetes.io/projected/e9192fe2-573e-4d32-915c-535887423540-kube-api-access-s4ffj\") pod \"cinder-backup-0\" (UID: \"e9192fe2-573e-4d32-915c-535887423540\") " pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.083074 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr4jq\" (UniqueName: \"kubernetes.io/projected/0ca8173d-4029-4a57-80c9-c63c05842bb5-kube-api-access-zr4jq\") pod \"cinder-volume-nfs-0\" (UID: \"0ca8173d-4029-4a57-80c9-c63c05842bb5\") " pod="openstack/cinder-volume-nfs-0" Feb 
20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.134985 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.155919 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.155973 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156012 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156029 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156071 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " 
pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156099 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156117 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156147 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156194 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156234 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156255 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156306 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156335 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbrmf\" (UniqueName: \"kubernetes.io/projected/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-kube-api-access-gbrmf\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156381 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156407 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156528 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-etc-nvme\") 
pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.156094 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.157377 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.157392 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.157419 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.157374 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.157445 4697 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.157496 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.157502 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.157530 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.161160 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.161268 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-combined-ca-bundle\") pod 
\"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.162121 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.163996 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.176323 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbrmf\" (UniqueName: \"kubernetes.io/projected/59ea396a-fa78-4a1a-95e5-48f3e4e49bda-kube-api-access-gbrmf\") pod \"cinder-volume-nfs-2-0\" (UID: \"59ea396a-fa78-4a1a-95e5-48f3e4e49bda\") " pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.201522 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.326235 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.717870 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.724035 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 17:16:56 crc kubenswrapper[4697]: I0220 17:16:56.994545 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 20 17:16:57 crc kubenswrapper[4697]: W0220 17:16:57.018090 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ca8173d_4029_4a57_80c9_c63c05842bb5.slice/crio-44b82bc7a0c6f929a9ecbc7c1929b1a78ecefd8a61c5077917fc52f1f70d8704 WatchSource:0}: Error finding container 44b82bc7a0c6f929a9ecbc7c1929b1a78ecefd8a61c5077917fc52f1f70d8704: Status 404 returned error can't find the container with id 44b82bc7a0c6f929a9ecbc7c1929b1a78ecefd8a61c5077917fc52f1f70d8704 Feb 20 17:16:57 crc kubenswrapper[4697]: I0220 17:16:57.063972 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 20 17:16:57 crc kubenswrapper[4697]: W0220 17:16:57.076669 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59ea396a_fa78_4a1a_95e5_48f3e4e49bda.slice/crio-eeb681b136395aff72e060af9563d2ceecc99306ffe7e0fd4131723e14dbc697 WatchSource:0}: Error finding container eeb681b136395aff72e060af9563d2ceecc99306ffe7e0fd4131723e14dbc697: Status 404 returned error can't find the container with id eeb681b136395aff72e060af9563d2ceecc99306ffe7e0fd4131723e14dbc697 Feb 20 17:16:57 crc kubenswrapper[4697]: I0220 17:16:57.085944 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"e9192fe2-573e-4d32-915c-535887423540","Type":"ContainerStarted","Data":"cb5665344dd516e9259318dc6e8dd3ad8adc7a259021e1c8054b3d93a15e6043"} Feb 20 17:16:57 crc kubenswrapper[4697]: I0220 17:16:57.087526 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"0ca8173d-4029-4a57-80c9-c63c05842bb5","Type":"ContainerStarted","Data":"44b82bc7a0c6f929a9ecbc7c1929b1a78ecefd8a61c5077917fc52f1f70d8704"} Feb 20 17:16:57 crc kubenswrapper[4697]: I0220 17:16:57.089076 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"59ea396a-fa78-4a1a-95e5-48f3e4e49bda","Type":"ContainerStarted","Data":"eeb681b136395aff72e060af9563d2ceecc99306ffe7e0fd4131723e14dbc697"} Feb 20 17:16:58 crc kubenswrapper[4697]: I0220 17:16:58.098476 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e9192fe2-573e-4d32-915c-535887423540","Type":"ContainerStarted","Data":"8b5f9d858b401d621f050d1e1f2f14d0f8a3997971cdadc3d10c23be500b013c"} Feb 20 17:16:58 crc kubenswrapper[4697]: I0220 17:16:58.099193 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e9192fe2-573e-4d32-915c-535887423540","Type":"ContainerStarted","Data":"39889ce0ef904622913b57c3fc02cc933ebb6604852f2c1c7773b2a305261b69"} Feb 20 17:16:58 crc kubenswrapper[4697]: I0220 17:16:58.101077 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"0ca8173d-4029-4a57-80c9-c63c05842bb5","Type":"ContainerStarted","Data":"f15416412d0cf82c9cd516874571ee047ec7f8d376a392403b76aa1a907df9bc"} Feb 20 17:16:58 crc kubenswrapper[4697]: I0220 17:16:58.101196 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"0ca8173d-4029-4a57-80c9-c63c05842bb5","Type":"ContainerStarted","Data":"7ae261ccd8cc0d7b9f94067b57a49d9f93fe834e80e50ac416f8e53820b01dc3"} Feb 20 17:16:58 
crc kubenswrapper[4697]: I0220 17:16:58.104639 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"59ea396a-fa78-4a1a-95e5-48f3e4e49bda","Type":"ContainerStarted","Data":"7e4b75f747f7ff7aaff9a591e3dc22b80917b9cfc9a517467964e85a37b1aba4"} Feb 20 17:16:58 crc kubenswrapper[4697]: I0220 17:16:58.104685 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"59ea396a-fa78-4a1a-95e5-48f3e4e49bda","Type":"ContainerStarted","Data":"05d454c6b68335fed298fa06cba39f3360bc3a72490f06051b06b10669d179cc"} Feb 20 17:16:58 crc kubenswrapper[4697]: I0220 17:16:58.150819 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.8518535739999997 podStartE2EDuration="3.15080223s" podCreationTimestamp="2026-02-20 17:16:55 +0000 UTC" firstStartedPulling="2026-02-20 17:16:56.723819808 +0000 UTC m=+2724.503865216" lastFinishedPulling="2026-02-20 17:16:57.022768464 +0000 UTC m=+2724.802813872" observedRunningTime="2026-02-20 17:16:58.139794682 +0000 UTC m=+2725.919840090" watchObservedRunningTime="2026-02-20 17:16:58.15080223 +0000 UTC m=+2725.930847638" Feb 20 17:16:58 crc kubenswrapper[4697]: I0220 17:16:58.174583 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=2.908251085 podStartE2EDuration="3.174559617s" podCreationTimestamp="2026-02-20 17:16:55 +0000 UTC" firstStartedPulling="2026-02-20 17:16:57.020856918 +0000 UTC m=+2724.800902326" lastFinishedPulling="2026-02-20 17:16:57.28716545 +0000 UTC m=+2725.067210858" observedRunningTime="2026-02-20 17:16:58.169365971 +0000 UTC m=+2725.949411379" watchObservedRunningTime="2026-02-20 17:16:58.174559617 +0000 UTC m=+2725.954605025" Feb 20 17:16:58 crc kubenswrapper[4697]: I0220 17:16:58.197754 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=2.9922773879999998 podStartE2EDuration="3.197733601s" podCreationTimestamp="2026-02-20 17:16:55 +0000 UTC" firstStartedPulling="2026-02-20 17:16:57.080530298 +0000 UTC m=+2724.860575706" lastFinishedPulling="2026-02-20 17:16:57.285986511 +0000 UTC m=+2725.066031919" observedRunningTime="2026-02-20 17:16:58.192584615 +0000 UTC m=+2725.972630023" watchObservedRunningTime="2026-02-20 17:16:58.197733601 +0000 UTC m=+2725.977778999" Feb 20 17:17:01 crc kubenswrapper[4697]: I0220 17:17:01.136062 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 20 17:17:01 crc kubenswrapper[4697]: I0220 17:17:01.202630 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Feb 20 17:17:01 crc kubenswrapper[4697]: I0220 17:17:01.326357 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:17:06 crc kubenswrapper[4697]: I0220 17:17:06.311159 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 20 17:17:06 crc kubenswrapper[4697]: I0220 17:17:06.411591 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Feb 20 17:17:06 crc kubenswrapper[4697]: I0220 17:17:06.605908 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.350166 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pbkfz"] Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.353537 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.361368 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pbkfz"] Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.449322 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-utilities\") pod \"community-operators-pbkfz\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.449478 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-catalog-content\") pod \"community-operators-pbkfz\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.449506 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn8r9\" (UniqueName: \"kubernetes.io/projected/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-kube-api-access-xn8r9\") pod \"community-operators-pbkfz\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.551324 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-utilities\") pod \"community-operators-pbkfz\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.551427 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-catalog-content\") pod \"community-operators-pbkfz\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.551671 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn8r9\" (UniqueName: \"kubernetes.io/projected/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-kube-api-access-xn8r9\") pod \"community-operators-pbkfz\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.552055 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-utilities\") pod \"community-operators-pbkfz\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.552092 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-catalog-content\") pod \"community-operators-pbkfz\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.573478 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn8r9\" (UniqueName: \"kubernetes.io/projected/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-kube-api-access-xn8r9\") pod \"community-operators-pbkfz\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:47 crc kubenswrapper[4697]: I0220 17:17:47.692112 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:48 crc kubenswrapper[4697]: I0220 17:17:48.289407 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pbkfz"] Feb 20 17:17:48 crc kubenswrapper[4697]: W0220 17:17:48.300199 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode64a199d_6d38_43d4_bf8c_b28e9601d0bd.slice/crio-35aa174641ef0aac4184797bc48cef60b4efb5bff0c2a9558b9fa72a56ac3f1f WatchSource:0}: Error finding container 35aa174641ef0aac4184797bc48cef60b4efb5bff0c2a9558b9fa72a56ac3f1f: Status 404 returned error can't find the container with id 35aa174641ef0aac4184797bc48cef60b4efb5bff0c2a9558b9fa72a56ac3f1f Feb 20 17:17:48 crc kubenswrapper[4697]: I0220 17:17:48.607101 4697 generic.go:334] "Generic (PLEG): container finished" podID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" containerID="041b538c724428269e8af68fc4d294699c826200d03736799d270c8cb58f5de5" exitCode=0 Feb 20 17:17:48 crc kubenswrapper[4697]: I0220 17:17:48.607199 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbkfz" event={"ID":"e64a199d-6d38-43d4-bf8c-b28e9601d0bd","Type":"ContainerDied","Data":"041b538c724428269e8af68fc4d294699c826200d03736799d270c8cb58f5de5"} Feb 20 17:17:48 crc kubenswrapper[4697]: I0220 17:17:48.607406 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbkfz" event={"ID":"e64a199d-6d38-43d4-bf8c-b28e9601d0bd","Type":"ContainerStarted","Data":"35aa174641ef0aac4184797bc48cef60b4efb5bff0c2a9558b9fa72a56ac3f1f"} Feb 20 17:17:50 crc kubenswrapper[4697]: I0220 17:17:50.626664 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbkfz" 
event={"ID":"e64a199d-6d38-43d4-bf8c-b28e9601d0bd","Type":"ContainerStarted","Data":"e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490"} Feb 20 17:17:51 crc kubenswrapper[4697]: I0220 17:17:51.641237 4697 generic.go:334] "Generic (PLEG): container finished" podID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" containerID="e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490" exitCode=0 Feb 20 17:17:51 crc kubenswrapper[4697]: I0220 17:17:51.641308 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbkfz" event={"ID":"e64a199d-6d38-43d4-bf8c-b28e9601d0bd","Type":"ContainerDied","Data":"e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490"} Feb 20 17:17:52 crc kubenswrapper[4697]: I0220 17:17:52.652893 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbkfz" event={"ID":"e64a199d-6d38-43d4-bf8c-b28e9601d0bd","Type":"ContainerStarted","Data":"e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2"} Feb 20 17:17:52 crc kubenswrapper[4697]: I0220 17:17:52.676950 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pbkfz" podStartSLOduration=2.208657767 podStartE2EDuration="5.676933009s" podCreationTimestamp="2026-02-20 17:17:47 +0000 UTC" firstStartedPulling="2026-02-20 17:17:48.610596252 +0000 UTC m=+2776.390641660" lastFinishedPulling="2026-02-20 17:17:52.078871494 +0000 UTC m=+2779.858916902" observedRunningTime="2026-02-20 17:17:52.670928443 +0000 UTC m=+2780.450973851" watchObservedRunningTime="2026-02-20 17:17:52.676933009 +0000 UTC m=+2780.456978417" Feb 20 17:17:57 crc kubenswrapper[4697]: I0220 17:17:57.692204 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:57 crc kubenswrapper[4697]: I0220 17:17:57.693193 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:57 crc kubenswrapper[4697]: I0220 17:17:57.753380 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:17:58 crc kubenswrapper[4697]: I0220 17:17:58.828233 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.242310 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pbkfz"] Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.243198 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pbkfz" podUID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" containerName="registry-server" containerID="cri-o://e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2" gracePeriod=2 Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.729276 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.769622 4697 generic.go:334] "Generic (PLEG): container finished" podID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" containerID="e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2" exitCode=0 Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.769668 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbkfz" event={"ID":"e64a199d-6d38-43d4-bf8c-b28e9601d0bd","Type":"ContainerDied","Data":"e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2"} Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.769702 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbkfz" event={"ID":"e64a199d-6d38-43d4-bf8c-b28e9601d0bd","Type":"ContainerDied","Data":"35aa174641ef0aac4184797bc48cef60b4efb5bff0c2a9558b9fa72a56ac3f1f"} Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.769718 4697 scope.go:117] "RemoveContainer" containerID="e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.769838 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pbkfz" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.799156 4697 scope.go:117] "RemoveContainer" containerID="e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.818080 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-catalog-content\") pod \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.818311 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-utilities\") pod \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.818395 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn8r9\" (UniqueName: \"kubernetes.io/projected/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-kube-api-access-xn8r9\") pod \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\" (UID: \"e64a199d-6d38-43d4-bf8c-b28e9601d0bd\") " Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.820504 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-utilities" (OuterVolumeSpecName: "utilities") pod "e64a199d-6d38-43d4-bf8c-b28e9601d0bd" (UID: "e64a199d-6d38-43d4-bf8c-b28e9601d0bd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.827648 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-kube-api-access-xn8r9" (OuterVolumeSpecName: "kube-api-access-xn8r9") pod "e64a199d-6d38-43d4-bf8c-b28e9601d0bd" (UID: "e64a199d-6d38-43d4-bf8c-b28e9601d0bd"). InnerVolumeSpecName "kube-api-access-xn8r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.844721 4697 scope.go:117] "RemoveContainer" containerID="041b538c724428269e8af68fc4d294699c826200d03736799d270c8cb58f5de5" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.878209 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e64a199d-6d38-43d4-bf8c-b28e9601d0bd" (UID: "e64a199d-6d38-43d4-bf8c-b28e9601d0bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.922126 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn8r9\" (UniqueName: \"kubernetes.io/projected/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-kube-api-access-xn8r9\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.922901 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.923008 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64a199d-6d38-43d4-bf8c-b28e9601d0bd-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.936510 4697 scope.go:117] "RemoveContainer" containerID="e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2" Feb 20 17:18:02 crc kubenswrapper[4697]: E0220 17:18:02.937573 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2\": container with ID starting with e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2 not found: ID does not exist" containerID="e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.937692 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2"} err="failed to get container status \"e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2\": rpc error: code = NotFound desc = could not find container \"e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2\": container with ID 
starting with e354b34f1e494b4a9ba063b3cd8a6eb0325b677b3199ac20b1a6bf90d9cee4f2 not found: ID does not exist" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.937795 4697 scope.go:117] "RemoveContainer" containerID="e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490" Feb 20 17:18:02 crc kubenswrapper[4697]: E0220 17:18:02.938699 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490\": container with ID starting with e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490 not found: ID does not exist" containerID="e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.938733 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490"} err="failed to get container status \"e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490\": rpc error: code = NotFound desc = could not find container \"e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490\": container with ID starting with e33f375b0f5f785f5f4cdf94ee23ed63872e404388e18af8aba292e1a3e97490 not found: ID does not exist" Feb 20 17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.938785 4697 scope.go:117] "RemoveContainer" containerID="041b538c724428269e8af68fc4d294699c826200d03736799d270c8cb58f5de5" Feb 20 17:18:02 crc kubenswrapper[4697]: E0220 17:18:02.939098 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"041b538c724428269e8af68fc4d294699c826200d03736799d270c8cb58f5de5\": container with ID starting with 041b538c724428269e8af68fc4d294699c826200d03736799d270c8cb58f5de5 not found: ID does not exist" containerID="041b538c724428269e8af68fc4d294699c826200d03736799d270c8cb58f5de5" Feb 20 
17:18:02 crc kubenswrapper[4697]: I0220 17:18:02.939126 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041b538c724428269e8af68fc4d294699c826200d03736799d270c8cb58f5de5"} err="failed to get container status \"041b538c724428269e8af68fc4d294699c826200d03736799d270c8cb58f5de5\": rpc error: code = NotFound desc = could not find container \"041b538c724428269e8af68fc4d294699c826200d03736799d270c8cb58f5de5\": container with ID starting with 041b538c724428269e8af68fc4d294699c826200d03736799d270c8cb58f5de5 not found: ID does not exist" Feb 20 17:18:03 crc kubenswrapper[4697]: I0220 17:18:03.105184 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pbkfz"] Feb 20 17:18:03 crc kubenswrapper[4697]: I0220 17:18:03.119579 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pbkfz"] Feb 20 17:18:04 crc kubenswrapper[4697]: I0220 17:18:04.470181 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 17:18:04 crc kubenswrapper[4697]: I0220 17:18:04.471095 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="prometheus" containerID="cri-o://6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9" gracePeriod=600 Feb 20 17:18:04 crc kubenswrapper[4697]: I0220 17:18:04.471226 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="thanos-sidecar" containerID="cri-o://7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8" gracePeriod=600 Feb 20 17:18:04 crc kubenswrapper[4697]: I0220 17:18:04.476535 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" 
podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="config-reloader" containerID="cri-o://818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4" gracePeriod=600 Feb 20 17:18:04 crc kubenswrapper[4697]: I0220 17:18:04.791138 4697 generic.go:334] "Generic (PLEG): container finished" podID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerID="7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8" exitCode=0 Feb 20 17:18:04 crc kubenswrapper[4697]: I0220 17:18:04.791391 4697 generic.go:334] "Generic (PLEG): container finished" podID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerID="6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9" exitCode=0 Feb 20 17:18:04 crc kubenswrapper[4697]: I0220 17:18:04.791232 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23464f44-ddc9-4b6e-8e53-6196d0136cc0","Type":"ContainerDied","Data":"7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8"} Feb 20 17:18:04 crc kubenswrapper[4697]: I0220 17:18:04.791445 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23464f44-ddc9-4b6e-8e53-6196d0136cc0","Type":"ContainerDied","Data":"6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9"} Feb 20 17:18:04 crc kubenswrapper[4697]: I0220 17:18:04.889037 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" path="/var/lib/kubelet/pods/e64a199d-6d38-43d4-bf8c-b28e9601d0bd/volumes" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.672253 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.790342 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config-out\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.790725 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-secret-combined-ca-bundle\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.790997 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.791069 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-0\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.791672 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: 
\"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.791715 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-tls-assets\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.791741 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.791793 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-2\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.791808 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-thanos-prometheus-http-client-file\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.791854 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc 
kubenswrapper[4697]: I0220 17:18:05.791899 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9vv9\" (UniqueName: \"kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-kube-api-access-h9vv9\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.791963 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.791996 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-1\") pod \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\" (UID: \"23464f44-ddc9-4b6e-8e53-6196d0136cc0\") " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.792399 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.794393 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.795097 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.801225 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config" (OuterVolumeSpecName: "config") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.801748 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.803404 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). 
InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.803457 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.803946 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.806010 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config-out" (OuterVolumeSpecName: "config-out") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.806247 4697 generic.go:334] "Generic (PLEG): container finished" podID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerID="818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4" exitCode=0 Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.806346 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23464f44-ddc9-4b6e-8e53-6196d0136cc0","Type":"ContainerDied","Data":"818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4"} Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.806426 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23464f44-ddc9-4b6e-8e53-6196d0136cc0","Type":"ContainerDied","Data":"6c9e20c7fcb3e218e91783aeb1fc75df8aeab9083f9bae04ed9074507666312d"} Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.806527 4697 scope.go:117] "RemoveContainer" containerID="7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.806694 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.813895 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-kube-api-access-h9vv9" (OuterVolumeSpecName: "kube-api-access-h9vv9") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). InnerVolumeSpecName "kube-api-access-h9vv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.825516 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.835163 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). InnerVolumeSpecName "pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.896602 4697 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.896680 4697 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.896714 4697 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 
crc kubenswrapper[4697]: I0220 17:18:05.896745 4697 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.896772 4697 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.896797 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9vv9\" (UniqueName: \"kubernetes.io/projected/23464f44-ddc9-4b6e-8e53-6196d0136cc0-kube-api-access-h9vv9\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.896821 4697 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.896846 4697 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23464f44-ddc9-4b6e-8e53-6196d0136cc0-config-out\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.896875 4697 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.896899 4697 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.896924 4697 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/23464f44-ddc9-4b6e-8e53-6196d0136cc0-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.896997 4697 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") on node \"crc\" " Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.931603 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config" (OuterVolumeSpecName: "web-config") pod "23464f44-ddc9-4b6e-8e53-6196d0136cc0" (UID: "23464f44-ddc9-4b6e-8e53-6196d0136cc0"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.943676 4697 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.944977 4697 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0") on node "crc" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.999566 4697 reconciler_common.go:293] "Volume detached for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:05 crc kubenswrapper[4697]: I0220 17:18:05.999933 4697 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23464f44-ddc9-4b6e-8e53-6196d0136cc0-web-config\") on node \"crc\" DevicePath \"\"" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.047737 4697 scope.go:117] "RemoveContainer" containerID="818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.070627 4697 scope.go:117] "RemoveContainer" containerID="6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.092145 4697 scope.go:117] "RemoveContainer" containerID="17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.115898 4697 scope.go:117] "RemoveContainer" containerID="7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8" Feb 20 17:18:06 crc kubenswrapper[4697]: E0220 17:18:06.117202 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8\": container with ID starting with 7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8 not found: ID does not exist" 
containerID="7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.117246 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8"} err="failed to get container status \"7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8\": rpc error: code = NotFound desc = could not find container \"7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8\": container with ID starting with 7d815eff7108e752739e282e0e1ccdefee27e9b94fd5b3367b966421e9b78ab8 not found: ID does not exist" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.117288 4697 scope.go:117] "RemoveContainer" containerID="818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4" Feb 20 17:18:06 crc kubenswrapper[4697]: E0220 17:18:06.121489 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4\": container with ID starting with 818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4 not found: ID does not exist" containerID="818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.121543 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4"} err="failed to get container status \"818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4\": rpc error: code = NotFound desc = could not find container \"818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4\": container with ID starting with 818caa9650215d3a8bda97485e29960a46b5230e22ae61cd3973edf29e4ab5a4 not found: ID does not exist" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.121569 4697 scope.go:117] 
"RemoveContainer" containerID="6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9" Feb 20 17:18:06 crc kubenswrapper[4697]: E0220 17:18:06.122016 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9\": container with ID starting with 6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9 not found: ID does not exist" containerID="6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.122040 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9"} err="failed to get container status \"6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9\": rpc error: code = NotFound desc = could not find container \"6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9\": container with ID starting with 6c860bd731e7d4644d3f20bf904e8e1da71052fcb37ad36e552c8d1868f4aac9 not found: ID does not exist" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.122053 4697 scope.go:117] "RemoveContainer" containerID="17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d" Feb 20 17:18:06 crc kubenswrapper[4697]: E0220 17:18:06.122350 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d\": container with ID starting with 17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d not found: ID does not exist" containerID="17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.122406 4697 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d"} err="failed to get container status \"17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d\": rpc error: code = NotFound desc = could not find container \"17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d\": container with ID starting with 17f802dae38b06b5b55a2659fd54e20ccd1b40172d6d4e186b3bd93ec13cba6d not found: ID does not exist" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.143448 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.153363 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.171974 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 17:18:06 crc kubenswrapper[4697]: E0220 17:18:06.172383 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="thanos-sidecar" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.172402 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="thanos-sidecar" Feb 20 17:18:06 crc kubenswrapper[4697]: E0220 17:18:06.172418 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" containerName="extract-utilities" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.172425 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" containerName="extract-utilities" Feb 20 17:18:06 crc kubenswrapper[4697]: E0220 17:18:06.172454 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="prometheus" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 
17:18:06.172460 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="prometheus" Feb 20 17:18:06 crc kubenswrapper[4697]: E0220 17:18:06.172481 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" containerName="registry-server" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.172487 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" containerName="registry-server" Feb 20 17:18:06 crc kubenswrapper[4697]: E0220 17:18:06.172514 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="init-config-reloader" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.172520 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="init-config-reloader" Feb 20 17:18:06 crc kubenswrapper[4697]: E0220 17:18:06.172532 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="config-reloader" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.172537 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="config-reloader" Feb 20 17:18:06 crc kubenswrapper[4697]: E0220 17:18:06.172550 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" containerName="extract-content" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.172556 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" containerName="extract-content" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.172721 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="prometheus" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.172737 
4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="config-reloader" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.172752 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" containerName="thanos-sidecar" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.172764 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64a199d-6d38-43d4-bf8c-b28e9601d0bd" containerName="registry-server" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.174449 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.176625 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.177963 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.179711 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.179788 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.180207 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.180506 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-tnj5w" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.186798 4697 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.202372 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.252263 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.310872 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.310938 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24h8l\" (UniqueName: \"kubernetes.io/projected/2cb601d7-bad1-4085-9519-4cb9927fa531-kube-api-access-24h8l\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.311181 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2cb601d7-bad1-4085-9519-4cb9927fa531-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.311303 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/2cb601d7-bad1-4085-9519-4cb9927fa531-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.311474 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.311572 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.311647 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.311741 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/2cb601d7-bad1-4085-9519-4cb9927fa531-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.311823 4697 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/2cb601d7-bad1-4085-9519-4cb9927fa531-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.311875 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.311908 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.311945 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2cb601d7-bad1-4085-9519-4cb9927fa531-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.312014 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod 
\"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414496 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2cb601d7-bad1-4085-9519-4cb9927fa531-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414564 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414593 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414610 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24h8l\" (UniqueName: \"kubernetes.io/projected/2cb601d7-bad1-4085-9519-4cb9927fa531-kube-api-access-24h8l\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414674 4697 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2cb601d7-bad1-4085-9519-4cb9927fa531-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414709 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2cb601d7-bad1-4085-9519-4cb9927fa531-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414759 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414779 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414799 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414819 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/2cb601d7-bad1-4085-9519-4cb9927fa531-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414861 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/2cb601d7-bad1-4085-9519-4cb9927fa531-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414884 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.414903 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.416265 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/2cb601d7-bad1-4085-9519-4cb9927fa531-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 
20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.416841 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2cb601d7-bad1-4085-9519-4cb9927fa531-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.417366 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/2cb601d7-bad1-4085-9519-4cb9927fa531-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.419696 4697 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.419738 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ab999197bb553d62566307ca20e48f871152403b8e1c643d4e6f778eae279956/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.420251 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.421058 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.421359 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.422373 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.422411 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.422559 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2cb601d7-bad1-4085-9519-4cb9927fa531-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.423606 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2cb601d7-bad1-4085-9519-4cb9927fa531-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.427073 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2cb601d7-bad1-4085-9519-4cb9927fa531-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.434537 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-24h8l\" (UniqueName: \"kubernetes.io/projected/2cb601d7-bad1-4085-9519-4cb9927fa531-kube-api-access-24h8l\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.463950 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-09590b87-d57a-4767-b49e-877bb7ed0fa0\") pod \"prometheus-metric-storage-0\" (UID: \"2cb601d7-bad1-4085-9519-4cb9927fa531\") " pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.513803 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:06 crc kubenswrapper[4697]: I0220 17:18:06.892503 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23464f44-ddc9-4b6e-8e53-6196d0136cc0" path="/var/lib/kubelet/pods/23464f44-ddc9-4b6e-8e53-6196d0136cc0/volumes" Feb 20 17:18:07 crc kubenswrapper[4697]: I0220 17:18:07.039378 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 17:18:07 crc kubenswrapper[4697]: I0220 17:18:07.825010 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cb601d7-bad1-4085-9519-4cb9927fa531","Type":"ContainerStarted","Data":"8010c3584d5eb0d31b7af9b13992a96d13beb648db747f5e16419e14f52008f1"} Feb 20 17:18:10 crc kubenswrapper[4697]: I0220 17:18:10.856653 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cb601d7-bad1-4085-9519-4cb9927fa531","Type":"ContainerStarted","Data":"7bae49e5a562a5e829ece0ca83465c0413998d208465cf6d5adb486c5af9bb4d"} Feb 20 17:18:17 crc kubenswrapper[4697]: I0220 17:18:17.995743 4697 generic.go:334] "Generic 
(PLEG): container finished" podID="2cb601d7-bad1-4085-9519-4cb9927fa531" containerID="7bae49e5a562a5e829ece0ca83465c0413998d208465cf6d5adb486c5af9bb4d" exitCode=0 Feb 20 17:18:17 crc kubenswrapper[4697]: I0220 17:18:17.995883 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cb601d7-bad1-4085-9519-4cb9927fa531","Type":"ContainerDied","Data":"7bae49e5a562a5e829ece0ca83465c0413998d208465cf6d5adb486c5af9bb4d"} Feb 20 17:18:19 crc kubenswrapper[4697]: I0220 17:18:19.011204 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cb601d7-bad1-4085-9519-4cb9927fa531","Type":"ContainerStarted","Data":"453fd1c88df25cacf88f31f98a1853023cfeefcca7e7988eba8538a4022cfd38"} Feb 20 17:18:22 crc kubenswrapper[4697]: I0220 17:18:22.041966 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cb601d7-bad1-4085-9519-4cb9927fa531","Type":"ContainerStarted","Data":"e4b06a34a0f559eb7ae388b6c945ce260c693b3b4af2147a46a870c1b90d1a81"} Feb 20 17:18:23 crc kubenswrapper[4697]: I0220 17:18:23.054199 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cb601d7-bad1-4085-9519-4cb9927fa531","Type":"ContainerStarted","Data":"24a869760a8a0b79d07dbd5776b62899847ec58a473ae80b733261d5814d445b"} Feb 20 17:18:23 crc kubenswrapper[4697]: I0220 17:18:23.081773 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.08175296 podStartE2EDuration="17.08175296s" podCreationTimestamp="2026-02-20 17:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 17:18:23.07725764 +0000 UTC m=+2810.857303068" watchObservedRunningTime="2026-02-20 17:18:23.08175296 +0000 UTC m=+2810.861798368" Feb 20 17:18:26 
crc kubenswrapper[4697]: I0220 17:18:26.514851 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:31 crc kubenswrapper[4697]: I0220 17:18:31.185104 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:18:31 crc kubenswrapper[4697]: I0220 17:18:31.185585 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:18:36 crc kubenswrapper[4697]: I0220 17:18:36.514899 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:36 crc kubenswrapper[4697]: I0220 17:18:36.521057 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:37 crc kubenswrapper[4697]: I0220 17:18:37.202651 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.524531 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.527224 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.530257 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.532228 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n2vrd" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.532795 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.536763 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.548272 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.670468 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.670549 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.670594 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-workdir\") pod 
\"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.670669 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.670727 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxswp\" (UniqueName: \"kubernetes.io/projected/e83acb01-3a91-4950-848f-d447679c0533-kube-api-access-cxswp\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.670754 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.670809 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-config-data\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.670836 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.670964 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.773276 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.773361 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.773418 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.773537 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.773613 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.773647 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxswp\" (UniqueName: \"kubernetes.io/projected/e83acb01-3a91-4950-848f-d447679c0533-kube-api-access-cxswp\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.773687 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-config-data\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.773716 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.773752 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.774155 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.774387 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.774509 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.775834 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.776424 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.783154 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.784864 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.791390 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.796935 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxswp\" (UniqueName: \"kubernetes.io/projected/e83acb01-3a91-4950-848f-d447679c0533-kube-api-access-cxswp\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.814582 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " pod="openstack/tempest-tests-tempest" Feb 20 17:18:55 crc kubenswrapper[4697]: I0220 17:18:55.870898 4697 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 20 17:18:56 crc kubenswrapper[4697]: I0220 17:18:56.481697 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 20 17:18:57 crc kubenswrapper[4697]: I0220 17:18:57.389665 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e83acb01-3a91-4950-848f-d447679c0533","Type":"ContainerStarted","Data":"53204102fa0927784a76063a0c24718be54c80f47e4dc4059aea2842199eca8d"} Feb 20 17:19:01 crc kubenswrapper[4697]: I0220 17:19:01.185258 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:19:01 crc kubenswrapper[4697]: I0220 17:19:01.185855 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:19:07 crc kubenswrapper[4697]: I0220 17:19:07.493804 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e83acb01-3a91-4950-848f-d447679c0533","Type":"ContainerStarted","Data":"cd8359e22280ef738d40191afad5bb4cfcccdc79bf17785a29085c7a3e81b60c"} Feb 20 17:19:07 crc kubenswrapper[4697]: I0220 17:19:07.520489 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.150210374 podStartE2EDuration="13.520470956s" podCreationTimestamp="2026-02-20 17:18:54 +0000 UTC" firstStartedPulling="2026-02-20 17:18:56.489659626 +0000 UTC m=+2844.269705034" lastFinishedPulling="2026-02-20 
17:19:05.859920188 +0000 UTC m=+2853.639965616" observedRunningTime="2026-02-20 17:19:07.513110907 +0000 UTC m=+2855.293156315" watchObservedRunningTime="2026-02-20 17:19:07.520470956 +0000 UTC m=+2855.300516364" Feb 20 17:19:31 crc kubenswrapper[4697]: I0220 17:19:31.184967 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:19:31 crc kubenswrapper[4697]: I0220 17:19:31.185701 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:19:31 crc kubenswrapper[4697]: I0220 17:19:31.185776 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 17:19:31 crc kubenswrapper[4697]: I0220 17:19:31.186783 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 17:19:31 crc kubenswrapper[4697]: I0220 17:19:31.186869 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" gracePeriod=600 Feb 
20 17:19:31 crc kubenswrapper[4697]: E0220 17:19:31.311882 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:19:31 crc kubenswrapper[4697]: I0220 17:19:31.751044 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" exitCode=0 Feb 20 17:19:31 crc kubenswrapper[4697]: I0220 17:19:31.751272 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87"} Feb 20 17:19:31 crc kubenswrapper[4697]: I0220 17:19:31.751410 4697 scope.go:117] "RemoveContainer" containerID="9d496f5c2a2fdecd518eeefb67465f7013c83166b86974cefe0c57b35aaead31" Feb 20 17:19:31 crc kubenswrapper[4697]: I0220 17:19:31.752170 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:19:31 crc kubenswrapper[4697]: E0220 17:19:31.752494 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:19:42 crc kubenswrapper[4697]: I0220 17:19:42.935063 4697 
scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:19:42 crc kubenswrapper[4697]: E0220 17:19:42.937385 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:19:46 crc kubenswrapper[4697]: I0220 17:19:46.333063 4697 scope.go:117] "RemoveContainer" containerID="8024cdfbf71b3e15cc6fda3357fbd7a462a25d62ee07e4313438858153a96149" Feb 20 17:19:46 crc kubenswrapper[4697]: I0220 17:19:46.358201 4697 scope.go:117] "RemoveContainer" containerID="209206eb002669894a1e4ffb54804c09a049062a8b4a440028b1b27da4af4fa5" Feb 20 17:19:46 crc kubenswrapper[4697]: I0220 17:19:46.438124 4697 scope.go:117] "RemoveContainer" containerID="e273caebf6e582dea605c05bb81c4e370aa0fc15da86e2e607a8923356e68361" Feb 20 17:19:55 crc kubenswrapper[4697]: I0220 17:19:55.876857 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:19:55 crc kubenswrapper[4697]: E0220 17:19:55.877624 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:20:08 crc kubenswrapper[4697]: I0220 17:20:08.877811 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 
17:20:08 crc kubenswrapper[4697]: E0220 17:20:08.878933 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:20:22 crc kubenswrapper[4697]: I0220 17:20:22.884986 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:20:22 crc kubenswrapper[4697]: E0220 17:20:22.886553 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:20:33 crc kubenswrapper[4697]: I0220 17:20:33.878804 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:20:33 crc kubenswrapper[4697]: E0220 17:20:33.882808 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:20:44 crc kubenswrapper[4697]: I0220 17:20:44.877453 4697 scope.go:117] "RemoveContainer" 
containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:20:44 crc kubenswrapper[4697]: E0220 17:20:44.878469 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:20:58 crc kubenswrapper[4697]: I0220 17:20:58.878361 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:20:58 crc kubenswrapper[4697]: E0220 17:20:58.879093 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:21:10 crc kubenswrapper[4697]: I0220 17:21:10.877202 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:21:10 crc kubenswrapper[4697]: E0220 17:21:10.878126 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:21:25 crc kubenswrapper[4697]: I0220 17:21:25.877817 4697 scope.go:117] 
"RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:21:25 crc kubenswrapper[4697]: E0220 17:21:25.878577 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:21:36 crc kubenswrapper[4697]: I0220 17:21:36.878351 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:21:36 crc kubenswrapper[4697]: E0220 17:21:36.879675 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:21:51 crc kubenswrapper[4697]: I0220 17:21:51.877195 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:21:51 crc kubenswrapper[4697]: E0220 17:21:51.878549 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:22:05 crc kubenswrapper[4697]: I0220 17:22:05.876728 
4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:22:05 crc kubenswrapper[4697]: E0220 17:22:05.877674 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:22:19 crc kubenswrapper[4697]: I0220 17:22:19.877792 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:22:19 crc kubenswrapper[4697]: E0220 17:22:19.879821 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:22:31 crc kubenswrapper[4697]: I0220 17:22:31.877780 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:22:31 crc kubenswrapper[4697]: E0220 17:22:31.878603 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:22:32 crc kubenswrapper[4697]: I0220 
17:22:32.911053 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pb4l6"] Feb 20 17:22:32 crc kubenswrapper[4697]: I0220 17:22:32.915037 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:32 crc kubenswrapper[4697]: I0220 17:22:32.930091 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pb4l6"] Feb 20 17:22:32 crc kubenswrapper[4697]: I0220 17:22:32.988893 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-utilities\") pod \"redhat-marketplace-pb4l6\" (UID: \"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:32 crc kubenswrapper[4697]: I0220 17:22:32.988957 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2lld\" (UniqueName: \"kubernetes.io/projected/2e5518fc-f4d3-466d-955c-393c7d2e4753-kube-api-access-n2lld\") pod \"redhat-marketplace-pb4l6\" (UID: \"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:32 crc kubenswrapper[4697]: I0220 17:22:32.989105 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-catalog-content\") pod \"redhat-marketplace-pb4l6\" (UID: \"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:33 crc kubenswrapper[4697]: I0220 17:22:33.091058 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-catalog-content\") pod 
\"redhat-marketplace-pb4l6\" (UID: \"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:33 crc kubenswrapper[4697]: I0220 17:22:33.091235 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-utilities\") pod \"redhat-marketplace-pb4l6\" (UID: \"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:33 crc kubenswrapper[4697]: I0220 17:22:33.091268 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2lld\" (UniqueName: \"kubernetes.io/projected/2e5518fc-f4d3-466d-955c-393c7d2e4753-kube-api-access-n2lld\") pod \"redhat-marketplace-pb4l6\" (UID: \"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:33 crc kubenswrapper[4697]: I0220 17:22:33.091598 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-catalog-content\") pod \"redhat-marketplace-pb4l6\" (UID: \"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:33 crc kubenswrapper[4697]: I0220 17:22:33.091856 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-utilities\") pod \"redhat-marketplace-pb4l6\" (UID: \"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:33 crc kubenswrapper[4697]: I0220 17:22:33.116399 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2lld\" (UniqueName: \"kubernetes.io/projected/2e5518fc-f4d3-466d-955c-393c7d2e4753-kube-api-access-n2lld\") pod \"redhat-marketplace-pb4l6\" (UID: 
\"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:33 crc kubenswrapper[4697]: I0220 17:22:33.249223 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:33 crc kubenswrapper[4697]: I0220 17:22:33.865510 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pb4l6"] Feb 20 17:22:33 crc kubenswrapper[4697]: W0220 17:22:33.873690 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e5518fc_f4d3_466d_955c_393c7d2e4753.slice/crio-71ddfa2969736fda6d4722ac25efa252ff19d8651504c6a7c0436ce90da097ee WatchSource:0}: Error finding container 71ddfa2969736fda6d4722ac25efa252ff19d8651504c6a7c0436ce90da097ee: Status 404 returned error can't find the container with id 71ddfa2969736fda6d4722ac25efa252ff19d8651504c6a7c0436ce90da097ee Feb 20 17:22:34 crc kubenswrapper[4697]: I0220 17:22:34.722712 4697 generic.go:334] "Generic (PLEG): container finished" podID="2e5518fc-f4d3-466d-955c-393c7d2e4753" containerID="596c3bb1d1243b997eaee38f18e56af9ff1e94142cdcc7016edea9d4b91e5f07" exitCode=0 Feb 20 17:22:34 crc kubenswrapper[4697]: I0220 17:22:34.722819 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb4l6" event={"ID":"2e5518fc-f4d3-466d-955c-393c7d2e4753","Type":"ContainerDied","Data":"596c3bb1d1243b997eaee38f18e56af9ff1e94142cdcc7016edea9d4b91e5f07"} Feb 20 17:22:34 crc kubenswrapper[4697]: I0220 17:22:34.723044 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb4l6" event={"ID":"2e5518fc-f4d3-466d-955c-393c7d2e4753","Type":"ContainerStarted","Data":"71ddfa2969736fda6d4722ac25efa252ff19d8651504c6a7c0436ce90da097ee"} Feb 20 17:22:34 crc kubenswrapper[4697]: I0220 17:22:34.724753 4697 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 17:22:35 crc kubenswrapper[4697]: I0220 17:22:35.749189 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb4l6" event={"ID":"2e5518fc-f4d3-466d-955c-393c7d2e4753","Type":"ContainerStarted","Data":"4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c"} Feb 20 17:22:36 crc kubenswrapper[4697]: I0220 17:22:36.763108 4697 generic.go:334] "Generic (PLEG): container finished" podID="2e5518fc-f4d3-466d-955c-393c7d2e4753" containerID="4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c" exitCode=0 Feb 20 17:22:36 crc kubenswrapper[4697]: I0220 17:22:36.763151 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb4l6" event={"ID":"2e5518fc-f4d3-466d-955c-393c7d2e4753","Type":"ContainerDied","Data":"4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c"} Feb 20 17:22:37 crc kubenswrapper[4697]: I0220 17:22:37.777154 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb4l6" event={"ID":"2e5518fc-f4d3-466d-955c-393c7d2e4753","Type":"ContainerStarted","Data":"6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e"} Feb 20 17:22:43 crc kubenswrapper[4697]: I0220 17:22:43.249835 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:43 crc kubenswrapper[4697]: I0220 17:22:43.250811 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:43 crc kubenswrapper[4697]: I0220 17:22:43.308114 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:43 crc kubenswrapper[4697]: I0220 17:22:43.337772 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-pb4l6" podStartSLOduration=8.90835806 podStartE2EDuration="11.337728125s" podCreationTimestamp="2026-02-20 17:22:32 +0000 UTC" firstStartedPulling="2026-02-20 17:22:34.724501302 +0000 UTC m=+3062.504546710" lastFinishedPulling="2026-02-20 17:22:37.153871337 +0000 UTC m=+3064.933916775" observedRunningTime="2026-02-20 17:22:37.807499279 +0000 UTC m=+3065.587544697" watchObservedRunningTime="2026-02-20 17:22:43.337728125 +0000 UTC m=+3071.117773543" Feb 20 17:22:43 crc kubenswrapper[4697]: I0220 17:22:43.900566 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:43 crc kubenswrapper[4697]: I0220 17:22:43.956716 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pb4l6"] Feb 20 17:22:45 crc kubenswrapper[4697]: I0220 17:22:45.862848 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pb4l6" podUID="2e5518fc-f4d3-466d-955c-393c7d2e4753" containerName="registry-server" containerID="cri-o://6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e" gracePeriod=2 Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.418517 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.526079 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-catalog-content\") pod \"2e5518fc-f4d3-466d-955c-393c7d2e4753\" (UID: \"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.526177 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2lld\" (UniqueName: \"kubernetes.io/projected/2e5518fc-f4d3-466d-955c-393c7d2e4753-kube-api-access-n2lld\") pod \"2e5518fc-f4d3-466d-955c-393c7d2e4753\" (UID: \"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.526344 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-utilities\") pod \"2e5518fc-f4d3-466d-955c-393c7d2e4753\" (UID: \"2e5518fc-f4d3-466d-955c-393c7d2e4753\") " Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.527547 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-utilities" (OuterVolumeSpecName: "utilities") pod "2e5518fc-f4d3-466d-955c-393c7d2e4753" (UID: "2e5518fc-f4d3-466d-955c-393c7d2e4753"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.528401 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.538197 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e5518fc-f4d3-466d-955c-393c7d2e4753-kube-api-access-n2lld" (OuterVolumeSpecName: "kube-api-access-n2lld") pod "2e5518fc-f4d3-466d-955c-393c7d2e4753" (UID: "2e5518fc-f4d3-466d-955c-393c7d2e4753"). InnerVolumeSpecName "kube-api-access-n2lld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.570901 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e5518fc-f4d3-466d-955c-393c7d2e4753" (UID: "2e5518fc-f4d3-466d-955c-393c7d2e4753"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.630138 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e5518fc-f4d3-466d-955c-393c7d2e4753-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.630472 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2lld\" (UniqueName: \"kubernetes.io/projected/2e5518fc-f4d3-466d-955c-393c7d2e4753-kube-api-access-n2lld\") on node \"crc\" DevicePath \"\"" Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.877063 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:22:46 crc kubenswrapper[4697]: E0220 17:22:46.877537 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.886950 4697 generic.go:334] "Generic (PLEG): container finished" podID="2e5518fc-f4d3-466d-955c-393c7d2e4753" containerID="6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e" exitCode=0 Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.887078 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pb4l6" Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.893530 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb4l6" event={"ID":"2e5518fc-f4d3-466d-955c-393c7d2e4753","Type":"ContainerDied","Data":"6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e"} Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.896600 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pb4l6" event={"ID":"2e5518fc-f4d3-466d-955c-393c7d2e4753","Type":"ContainerDied","Data":"71ddfa2969736fda6d4722ac25efa252ff19d8651504c6a7c0436ce90da097ee"} Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.896637 4697 scope.go:117] "RemoveContainer" containerID="6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e" Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.924851 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pb4l6"] Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.932269 4697 scope.go:117] "RemoveContainer" containerID="4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c" Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.936007 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pb4l6"] Feb 20 17:22:46 crc kubenswrapper[4697]: I0220 17:22:46.956721 4697 scope.go:117] "RemoveContainer" containerID="596c3bb1d1243b997eaee38f18e56af9ff1e94142cdcc7016edea9d4b91e5f07" Feb 20 17:22:47 crc kubenswrapper[4697]: I0220 17:22:47.000026 4697 scope.go:117] "RemoveContainer" containerID="6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e" Feb 20 17:22:47 crc kubenswrapper[4697]: E0220 17:22:47.000645 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e\": container with ID starting with 6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e not found: ID does not exist" containerID="6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e" Feb 20 17:22:47 crc kubenswrapper[4697]: I0220 17:22:47.000692 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e"} err="failed to get container status \"6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e\": rpc error: code = NotFound desc = could not find container \"6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e\": container with ID starting with 6523a4cdbbf852edc4983e7a958b4a1e379c0f6010ac737bc4903c4a88e0d63e not found: ID does not exist" Feb 20 17:22:47 crc kubenswrapper[4697]: I0220 17:22:47.000720 4697 scope.go:117] "RemoveContainer" containerID="4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c" Feb 20 17:22:47 crc kubenswrapper[4697]: E0220 17:22:47.001235 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c\": container with ID starting with 4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c not found: ID does not exist" containerID="4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c" Feb 20 17:22:47 crc kubenswrapper[4697]: I0220 17:22:47.001266 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c"} err="failed to get container status \"4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c\": rpc error: code = NotFound desc = could not find container \"4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c\": container with ID 
starting with 4d0870af2b8582a606d5c414c97a6c0afccf953bf73413099691759d8c396b5c not found: ID does not exist" Feb 20 17:22:47 crc kubenswrapper[4697]: I0220 17:22:47.001279 4697 scope.go:117] "RemoveContainer" containerID="596c3bb1d1243b997eaee38f18e56af9ff1e94142cdcc7016edea9d4b91e5f07" Feb 20 17:22:47 crc kubenswrapper[4697]: E0220 17:22:47.001677 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596c3bb1d1243b997eaee38f18e56af9ff1e94142cdcc7016edea9d4b91e5f07\": container with ID starting with 596c3bb1d1243b997eaee38f18e56af9ff1e94142cdcc7016edea9d4b91e5f07 not found: ID does not exist" containerID="596c3bb1d1243b997eaee38f18e56af9ff1e94142cdcc7016edea9d4b91e5f07" Feb 20 17:22:47 crc kubenswrapper[4697]: I0220 17:22:47.001905 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596c3bb1d1243b997eaee38f18e56af9ff1e94142cdcc7016edea9d4b91e5f07"} err="failed to get container status \"596c3bb1d1243b997eaee38f18e56af9ff1e94142cdcc7016edea9d4b91e5f07\": rpc error: code = NotFound desc = could not find container \"596c3bb1d1243b997eaee38f18e56af9ff1e94142cdcc7016edea9d4b91e5f07\": container with ID starting with 596c3bb1d1243b997eaee38f18e56af9ff1e94142cdcc7016edea9d4b91e5f07 not found: ID does not exist" Feb 20 17:22:48 crc kubenswrapper[4697]: I0220 17:22:48.893870 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e5518fc-f4d3-466d-955c-393c7d2e4753" path="/var/lib/kubelet/pods/2e5518fc-f4d3-466d-955c-393c7d2e4753/volumes" Feb 20 17:22:57 crc kubenswrapper[4697]: I0220 17:22:57.877755 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:22:57 crc kubenswrapper[4697]: E0220 17:22:57.878573 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:23:11 crc kubenswrapper[4697]: I0220 17:23:11.877556 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:23:11 crc kubenswrapper[4697]: E0220 17:23:11.878403 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.289828 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kv9qn"] Feb 20 17:23:19 crc kubenswrapper[4697]: E0220 17:23:19.291095 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5518fc-f4d3-466d-955c-393c7d2e4753" containerName="registry-server" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.291115 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5518fc-f4d3-466d-955c-393c7d2e4753" containerName="registry-server" Feb 20 17:23:19 crc kubenswrapper[4697]: E0220 17:23:19.291131 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5518fc-f4d3-466d-955c-393c7d2e4753" containerName="extract-content" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.291138 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5518fc-f4d3-466d-955c-393c7d2e4753" containerName="extract-content" Feb 20 17:23:19 crc kubenswrapper[4697]: E0220 17:23:19.291159 4697 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2e5518fc-f4d3-466d-955c-393c7d2e4753" containerName="extract-utilities" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.291167 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5518fc-f4d3-466d-955c-393c7d2e4753" containerName="extract-utilities" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.291410 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e5518fc-f4d3-466d-955c-393c7d2e4753" containerName="registry-server" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.293204 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.302154 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kv9qn"] Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.466477 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-catalog-content\") pod \"certified-operators-kv9qn\" (UID: \"a0cf27b0-a850-4220-995f-e624fed67474\") " pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.466893 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-utilities\") pod \"certified-operators-kv9qn\" (UID: \"a0cf27b0-a850-4220-995f-e624fed67474\") " pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.466925 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glfmt\" (UniqueName: \"kubernetes.io/projected/a0cf27b0-a850-4220-995f-e624fed67474-kube-api-access-glfmt\") pod 
\"certified-operators-kv9qn\" (UID: \"a0cf27b0-a850-4220-995f-e624fed67474\") " pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.569018 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glfmt\" (UniqueName: \"kubernetes.io/projected/a0cf27b0-a850-4220-995f-e624fed67474-kube-api-access-glfmt\") pod \"certified-operators-kv9qn\" (UID: \"a0cf27b0-a850-4220-995f-e624fed67474\") " pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.569335 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-catalog-content\") pod \"certified-operators-kv9qn\" (UID: \"a0cf27b0-a850-4220-995f-e624fed67474\") " pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.569535 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-utilities\") pod \"certified-operators-kv9qn\" (UID: \"a0cf27b0-a850-4220-995f-e624fed67474\") " pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.569838 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-catalog-content\") pod \"certified-operators-kv9qn\" (UID: \"a0cf27b0-a850-4220-995f-e624fed67474\") " pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.569849 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-utilities\") pod \"certified-operators-kv9qn\" (UID: 
\"a0cf27b0-a850-4220-995f-e624fed67474\") " pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.594325 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glfmt\" (UniqueName: \"kubernetes.io/projected/a0cf27b0-a850-4220-995f-e624fed67474-kube-api-access-glfmt\") pod \"certified-operators-kv9qn\" (UID: \"a0cf27b0-a850-4220-995f-e624fed67474\") " pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:19 crc kubenswrapper[4697]: I0220 17:23:19.654593 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:20 crc kubenswrapper[4697]: I0220 17:23:20.169160 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kv9qn"] Feb 20 17:23:20 crc kubenswrapper[4697]: I0220 17:23:20.198510 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9qn" event={"ID":"a0cf27b0-a850-4220-995f-e624fed67474","Type":"ContainerStarted","Data":"57f1561d573ff65bdb57a54207f24290d5da07f7c648bedcab459c0b7769e725"} Feb 20 17:23:21 crc kubenswrapper[4697]: I0220 17:23:21.207837 4697 generic.go:334] "Generic (PLEG): container finished" podID="a0cf27b0-a850-4220-995f-e624fed67474" containerID="b2aa7afcc6ba6621329cac9f22b4e6f095e0e52af3a180234bebd807b2bb5ea9" exitCode=0 Feb 20 17:23:21 crc kubenswrapper[4697]: I0220 17:23:21.207879 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9qn" event={"ID":"a0cf27b0-a850-4220-995f-e624fed67474","Type":"ContainerDied","Data":"b2aa7afcc6ba6621329cac9f22b4e6f095e0e52af3a180234bebd807b2bb5ea9"} Feb 20 17:23:22 crc kubenswrapper[4697]: I0220 17:23:22.218314 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9qn" 
event={"ID":"a0cf27b0-a850-4220-995f-e624fed67474","Type":"ContainerStarted","Data":"bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5"} Feb 20 17:23:24 crc kubenswrapper[4697]: I0220 17:23:24.238910 4697 generic.go:334] "Generic (PLEG): container finished" podID="a0cf27b0-a850-4220-995f-e624fed67474" containerID="bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5" exitCode=0 Feb 20 17:23:24 crc kubenswrapper[4697]: I0220 17:23:24.239054 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9qn" event={"ID":"a0cf27b0-a850-4220-995f-e624fed67474","Type":"ContainerDied","Data":"bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5"} Feb 20 17:23:25 crc kubenswrapper[4697]: I0220 17:23:25.250371 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9qn" event={"ID":"a0cf27b0-a850-4220-995f-e624fed67474","Type":"ContainerStarted","Data":"16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7"} Feb 20 17:23:25 crc kubenswrapper[4697]: I0220 17:23:25.283141 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kv9qn" podStartSLOduration=2.864912167 podStartE2EDuration="6.283119214s" podCreationTimestamp="2026-02-20 17:23:19 +0000 UTC" firstStartedPulling="2026-02-20 17:23:21.2094443 +0000 UTC m=+3108.989489708" lastFinishedPulling="2026-02-20 17:23:24.627651347 +0000 UTC m=+3112.407696755" observedRunningTime="2026-02-20 17:23:25.268645211 +0000 UTC m=+3113.048690629" watchObservedRunningTime="2026-02-20 17:23:25.283119214 +0000 UTC m=+3113.063164622" Feb 20 17:23:26 crc kubenswrapper[4697]: I0220 17:23:26.882521 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:23:26 crc kubenswrapper[4697]: E0220 17:23:26.883314 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:23:29 crc kubenswrapper[4697]: I0220 17:23:29.655190 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:29 crc kubenswrapper[4697]: I0220 17:23:29.655677 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:29 crc kubenswrapper[4697]: I0220 17:23:29.727609 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:30 crc kubenswrapper[4697]: I0220 17:23:30.343491 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:30 crc kubenswrapper[4697]: I0220 17:23:30.423726 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kv9qn"] Feb 20 17:23:32 crc kubenswrapper[4697]: I0220 17:23:32.324411 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kv9qn" podUID="a0cf27b0-a850-4220-995f-e624fed67474" containerName="registry-server" containerID="cri-o://16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7" gracePeriod=2 Feb 20 17:23:32 crc kubenswrapper[4697]: I0220 17:23:32.797256 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:32 crc kubenswrapper[4697]: I0220 17:23:32.956103 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-utilities\") pod \"a0cf27b0-a850-4220-995f-e624fed67474\" (UID: \"a0cf27b0-a850-4220-995f-e624fed67474\") " Feb 20 17:23:32 crc kubenswrapper[4697]: I0220 17:23:32.956889 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glfmt\" (UniqueName: \"kubernetes.io/projected/a0cf27b0-a850-4220-995f-e624fed67474-kube-api-access-glfmt\") pod \"a0cf27b0-a850-4220-995f-e624fed67474\" (UID: \"a0cf27b0-a850-4220-995f-e624fed67474\") " Feb 20 17:23:32 crc kubenswrapper[4697]: I0220 17:23:32.957621 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-catalog-content\") pod \"a0cf27b0-a850-4220-995f-e624fed67474\" (UID: \"a0cf27b0-a850-4220-995f-e624fed67474\") " Feb 20 17:23:32 crc kubenswrapper[4697]: I0220 17:23:32.957626 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-utilities" (OuterVolumeSpecName: "utilities") pod "a0cf27b0-a850-4220-995f-e624fed67474" (UID: "a0cf27b0-a850-4220-995f-e624fed67474"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:23:32 crc kubenswrapper[4697]: I0220 17:23:32.958541 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:23:32 crc kubenswrapper[4697]: I0220 17:23:32.966932 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0cf27b0-a850-4220-995f-e624fed67474-kube-api-access-glfmt" (OuterVolumeSpecName: "kube-api-access-glfmt") pod "a0cf27b0-a850-4220-995f-e624fed67474" (UID: "a0cf27b0-a850-4220-995f-e624fed67474"). InnerVolumeSpecName "kube-api-access-glfmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.012192 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0cf27b0-a850-4220-995f-e624fed67474" (UID: "a0cf27b0-a850-4220-995f-e624fed67474"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.061683 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glfmt\" (UniqueName: \"kubernetes.io/projected/a0cf27b0-a850-4220-995f-e624fed67474-kube-api-access-glfmt\") on node \"crc\" DevicePath \"\"" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.061728 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0cf27b0-a850-4220-995f-e624fed67474-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.338002 4697 generic.go:334] "Generic (PLEG): container finished" podID="a0cf27b0-a850-4220-995f-e624fed67474" containerID="16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7" exitCode=0 Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.338044 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9qn" event={"ID":"a0cf27b0-a850-4220-995f-e624fed67474","Type":"ContainerDied","Data":"16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7"} Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.338078 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9qn" event={"ID":"a0cf27b0-a850-4220-995f-e624fed67474","Type":"ContainerDied","Data":"57f1561d573ff65bdb57a54207f24290d5da07f7c648bedcab459c0b7769e725"} Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.338098 4697 scope.go:117] "RemoveContainer" containerID="16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.338119 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kv9qn" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.374461 4697 scope.go:117] "RemoveContainer" containerID="bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.392704 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kv9qn"] Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.396279 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kv9qn"] Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.401473 4697 scope.go:117] "RemoveContainer" containerID="b2aa7afcc6ba6621329cac9f22b4e6f095e0e52af3a180234bebd807b2bb5ea9" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.466818 4697 scope.go:117] "RemoveContainer" containerID="16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7" Feb 20 17:23:33 crc kubenswrapper[4697]: E0220 17:23:33.467291 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7\": container with ID starting with 16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7 not found: ID does not exist" containerID="16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.467343 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7"} err="failed to get container status \"16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7\": rpc error: code = NotFound desc = could not find container \"16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7\": container with ID starting with 16928efc336d5cf2a0c38d3c4d65adf4268d47efb24414b139dac2975b5fa0e7 not 
found: ID does not exist" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.467377 4697 scope.go:117] "RemoveContainer" containerID="bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5" Feb 20 17:23:33 crc kubenswrapper[4697]: E0220 17:23:33.467984 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5\": container with ID starting with bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5 not found: ID does not exist" containerID="bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.468045 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5"} err="failed to get container status \"bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5\": rpc error: code = NotFound desc = could not find container \"bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5\": container with ID starting with bc337e597a2885e387ace2a98983d6c72fa945da6f8b618e237d6445f1db5fc5 not found: ID does not exist" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.468064 4697 scope.go:117] "RemoveContainer" containerID="b2aa7afcc6ba6621329cac9f22b4e6f095e0e52af3a180234bebd807b2bb5ea9" Feb 20 17:23:33 crc kubenswrapper[4697]: E0220 17:23:33.469899 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2aa7afcc6ba6621329cac9f22b4e6f095e0e52af3a180234bebd807b2bb5ea9\": container with ID starting with b2aa7afcc6ba6621329cac9f22b4e6f095e0e52af3a180234bebd807b2bb5ea9 not found: ID does not exist" containerID="b2aa7afcc6ba6621329cac9f22b4e6f095e0e52af3a180234bebd807b2bb5ea9" Feb 20 17:23:33 crc kubenswrapper[4697]: I0220 17:23:33.469930 4697 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2aa7afcc6ba6621329cac9f22b4e6f095e0e52af3a180234bebd807b2bb5ea9"} err="failed to get container status \"b2aa7afcc6ba6621329cac9f22b4e6f095e0e52af3a180234bebd807b2bb5ea9\": rpc error: code = NotFound desc = could not find container \"b2aa7afcc6ba6621329cac9f22b4e6f095e0e52af3a180234bebd807b2bb5ea9\": container with ID starting with b2aa7afcc6ba6621329cac9f22b4e6f095e0e52af3a180234bebd807b2bb5ea9 not found: ID does not exist" Feb 20 17:23:34 crc kubenswrapper[4697]: I0220 17:23:34.890356 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0cf27b0-a850-4220-995f-e624fed67474" path="/var/lib/kubelet/pods/a0cf27b0-a850-4220-995f-e624fed67474/volumes" Feb 20 17:23:39 crc kubenswrapper[4697]: I0220 17:23:39.878711 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:23:39 crc kubenswrapper[4697]: E0220 17:23:39.880225 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:23:54 crc kubenswrapper[4697]: I0220 17:23:54.878069 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:23:54 crc kubenswrapper[4697]: E0220 17:23:54.879213 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:24:05 crc kubenswrapper[4697]: I0220 17:24:05.877029 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:24:05 crc kubenswrapper[4697]: E0220 17:24:05.878029 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:24:20 crc kubenswrapper[4697]: I0220 17:24:20.876896 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:24:20 crc kubenswrapper[4697]: E0220 17:24:20.877805 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:24:34 crc kubenswrapper[4697]: I0220 17:24:34.877544 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:24:36 crc kubenswrapper[4697]: I0220 17:24:36.037605 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" 
event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"f73f37b72ce2980f471c790db89bd17d540086f1537271c05d48495337743421"} Feb 20 17:27:01 crc kubenswrapper[4697]: I0220 17:27:01.184784 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:27:01 crc kubenswrapper[4697]: I0220 17:27:01.185540 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:27:31 crc kubenswrapper[4697]: I0220 17:27:31.184739 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:27:31 crc kubenswrapper[4697]: I0220 17:27:31.185267 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:27:58 crc kubenswrapper[4697]: I0220 17:27:58.755021 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tfr2j"] Feb 20 17:27:58 crc kubenswrapper[4697]: E0220 17:27:58.756218 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0cf27b0-a850-4220-995f-e624fed67474" 
containerName="extract-utilities" Feb 20 17:27:58 crc kubenswrapper[4697]: I0220 17:27:58.756239 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0cf27b0-a850-4220-995f-e624fed67474" containerName="extract-utilities" Feb 20 17:27:58 crc kubenswrapper[4697]: E0220 17:27:58.756273 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0cf27b0-a850-4220-995f-e624fed67474" containerName="registry-server" Feb 20 17:27:58 crc kubenswrapper[4697]: I0220 17:27:58.756280 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0cf27b0-a850-4220-995f-e624fed67474" containerName="registry-server" Feb 20 17:27:58 crc kubenswrapper[4697]: E0220 17:27:58.756317 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0cf27b0-a850-4220-995f-e624fed67474" containerName="extract-content" Feb 20 17:27:58 crc kubenswrapper[4697]: I0220 17:27:58.756326 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0cf27b0-a850-4220-995f-e624fed67474" containerName="extract-content" Feb 20 17:27:58 crc kubenswrapper[4697]: I0220 17:27:58.756594 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0cf27b0-a850-4220-995f-e624fed67474" containerName="registry-server" Feb 20 17:27:58 crc kubenswrapper[4697]: I0220 17:27:58.758469 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:27:58 crc kubenswrapper[4697]: I0220 17:27:58.768949 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tfr2j"] Feb 20 17:27:58 crc kubenswrapper[4697]: I0220 17:27:58.944136 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25pxq\" (UniqueName: \"kubernetes.io/projected/62666ed7-0943-42a6-8ddb-37319044100d-kube-api-access-25pxq\") pod \"community-operators-tfr2j\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:27:58 crc kubenswrapper[4697]: I0220 17:27:58.944277 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-utilities\") pod \"community-operators-tfr2j\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:27:58 crc kubenswrapper[4697]: I0220 17:27:58.944313 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-catalog-content\") pod \"community-operators-tfr2j\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:27:59 crc kubenswrapper[4697]: I0220 17:27:59.046898 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-utilities\") pod \"community-operators-tfr2j\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:27:59 crc kubenswrapper[4697]: I0220 17:27:59.047001 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-catalog-content\") pod \"community-operators-tfr2j\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:27:59 crc kubenswrapper[4697]: I0220 17:27:59.047225 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25pxq\" (UniqueName: \"kubernetes.io/projected/62666ed7-0943-42a6-8ddb-37319044100d-kube-api-access-25pxq\") pod \"community-operators-tfr2j\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:27:59 crc kubenswrapper[4697]: I0220 17:27:59.047881 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-utilities\") pod \"community-operators-tfr2j\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:27:59 crc kubenswrapper[4697]: I0220 17:27:59.047954 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-catalog-content\") pod \"community-operators-tfr2j\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:27:59 crc kubenswrapper[4697]: I0220 17:27:59.069001 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25pxq\" (UniqueName: \"kubernetes.io/projected/62666ed7-0943-42a6-8ddb-37319044100d-kube-api-access-25pxq\") pod \"community-operators-tfr2j\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:27:59 crc kubenswrapper[4697]: I0220 17:27:59.089595 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:27:59 crc kubenswrapper[4697]: I0220 17:27:59.647787 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tfr2j"] Feb 20 17:27:59 crc kubenswrapper[4697]: W0220 17:27:59.653322 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62666ed7_0943_42a6_8ddb_37319044100d.slice/crio-15dad28ff0f001e8c88be387740ef269a653a0583c3e14bcd7f4bdb8772ab7a9 WatchSource:0}: Error finding container 15dad28ff0f001e8c88be387740ef269a653a0583c3e14bcd7f4bdb8772ab7a9: Status 404 returned error can't find the container with id 15dad28ff0f001e8c88be387740ef269a653a0583c3e14bcd7f4bdb8772ab7a9 Feb 20 17:28:00 crc kubenswrapper[4697]: I0220 17:28:00.027805 4697 generic.go:334] "Generic (PLEG): container finished" podID="62666ed7-0943-42a6-8ddb-37319044100d" containerID="23ac8c4247d4b23209b53b3c8743f79c2a7e86d188b3806e4391616c8f443017" exitCode=0 Feb 20 17:28:00 crc kubenswrapper[4697]: I0220 17:28:00.027866 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfr2j" event={"ID":"62666ed7-0943-42a6-8ddb-37319044100d","Type":"ContainerDied","Data":"23ac8c4247d4b23209b53b3c8743f79c2a7e86d188b3806e4391616c8f443017"} Feb 20 17:28:00 crc kubenswrapper[4697]: I0220 17:28:00.028130 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfr2j" event={"ID":"62666ed7-0943-42a6-8ddb-37319044100d","Type":"ContainerStarted","Data":"15dad28ff0f001e8c88be387740ef269a653a0583c3e14bcd7f4bdb8772ab7a9"} Feb 20 17:28:00 crc kubenswrapper[4697]: I0220 17:28:00.031402 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.153747 4697 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-src9q"] Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.156175 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.163407 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-src9q"] Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.184584 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.184637 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.184682 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.185492 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f73f37b72ce2980f471c790db89bd17d540086f1537271c05d48495337743421"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.185542 4697 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://f73f37b72ce2980f471c790db89bd17d540086f1537271c05d48495337743421" gracePeriod=600 Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.213781 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-utilities\") pod \"redhat-operators-src9q\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.214016 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-catalog-content\") pod \"redhat-operators-src9q\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.214250 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj8j6\" (UniqueName: \"kubernetes.io/projected/013a568d-ab78-4ce3-922d-9cfa6c79d175-kube-api-access-rj8j6\") pod \"redhat-operators-src9q\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.317095 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj8j6\" (UniqueName: \"kubernetes.io/projected/013a568d-ab78-4ce3-922d-9cfa6c79d175-kube-api-access-rj8j6\") pod \"redhat-operators-src9q\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.317315 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-utilities\") pod \"redhat-operators-src9q\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.317363 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-catalog-content\") pod \"redhat-operators-src9q\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.318014 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-utilities\") pod \"redhat-operators-src9q\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.318127 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-catalog-content\") pod \"redhat-operators-src9q\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.345306 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj8j6\" (UniqueName: \"kubernetes.io/projected/013a568d-ab78-4ce3-922d-9cfa6c79d175-kube-api-access-rj8j6\") pod \"redhat-operators-src9q\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.480601 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:01 crc kubenswrapper[4697]: I0220 17:28:01.979171 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-src9q"] Feb 20 17:28:02 crc kubenswrapper[4697]: W0220 17:28:02.003762 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod013a568d_ab78_4ce3_922d_9cfa6c79d175.slice/crio-e66e6b5c65b5f632c6229ffeb5a6cefa2f16aa6a6e6815337ec8611c04c8492f WatchSource:0}: Error finding container e66e6b5c65b5f632c6229ffeb5a6cefa2f16aa6a6e6815337ec8611c04c8492f: Status 404 returned error can't find the container with id e66e6b5c65b5f632c6229ffeb5a6cefa2f16aa6a6e6815337ec8611c04c8492f Feb 20 17:28:02 crc kubenswrapper[4697]: I0220 17:28:02.048419 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="f73f37b72ce2980f471c790db89bd17d540086f1537271c05d48495337743421" exitCode=0 Feb 20 17:28:02 crc kubenswrapper[4697]: I0220 17:28:02.048662 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"f73f37b72ce2980f471c790db89bd17d540086f1537271c05d48495337743421"} Feb 20 17:28:02 crc kubenswrapper[4697]: I0220 17:28:02.049104 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb"} Feb 20 17:28:02 crc kubenswrapper[4697]: I0220 17:28:02.049187 4697 scope.go:117] "RemoveContainer" containerID="248fae5e0c27e06715672339b2cc7883c1f9e8e85fd9f971358caa5013788d87" Feb 20 17:28:02 crc kubenswrapper[4697]: I0220 17:28:02.052557 4697 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-tfr2j" event={"ID":"62666ed7-0943-42a6-8ddb-37319044100d","Type":"ContainerStarted","Data":"0d6abdcd9ee86d506dbe67836864d6f1c0be5bb38433978795d955a663306584"} Feb 20 17:28:02 crc kubenswrapper[4697]: I0220 17:28:02.056245 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-src9q" event={"ID":"013a568d-ab78-4ce3-922d-9cfa6c79d175","Type":"ContainerStarted","Data":"e66e6b5c65b5f632c6229ffeb5a6cefa2f16aa6a6e6815337ec8611c04c8492f"} Feb 20 17:28:03 crc kubenswrapper[4697]: I0220 17:28:03.065908 4697 generic.go:334] "Generic (PLEG): container finished" podID="62666ed7-0943-42a6-8ddb-37319044100d" containerID="0d6abdcd9ee86d506dbe67836864d6f1c0be5bb38433978795d955a663306584" exitCode=0 Feb 20 17:28:03 crc kubenswrapper[4697]: I0220 17:28:03.065995 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfr2j" event={"ID":"62666ed7-0943-42a6-8ddb-37319044100d","Type":"ContainerDied","Data":"0d6abdcd9ee86d506dbe67836864d6f1c0be5bb38433978795d955a663306584"} Feb 20 17:28:03 crc kubenswrapper[4697]: I0220 17:28:03.068456 4697 generic.go:334] "Generic (PLEG): container finished" podID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerID="ac29ad8d4d9b8d81ad512a338a0a1158a3092e6549241195b0decb73840a8aae" exitCode=0 Feb 20 17:28:03 crc kubenswrapper[4697]: I0220 17:28:03.068508 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-src9q" event={"ID":"013a568d-ab78-4ce3-922d-9cfa6c79d175","Type":"ContainerDied","Data":"ac29ad8d4d9b8d81ad512a338a0a1158a3092e6549241195b0decb73840a8aae"} Feb 20 17:28:04 crc kubenswrapper[4697]: I0220 17:28:04.088476 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfr2j" 
event={"ID":"62666ed7-0943-42a6-8ddb-37319044100d","Type":"ContainerStarted","Data":"c07171e467c5c1c260c993582591e4c41f2a97381d529a189742d72d33e08579"} Feb 20 17:28:04 crc kubenswrapper[4697]: I0220 17:28:04.093334 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-src9q" event={"ID":"013a568d-ab78-4ce3-922d-9cfa6c79d175","Type":"ContainerStarted","Data":"ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65"} Feb 20 17:28:04 crc kubenswrapper[4697]: I0220 17:28:04.111475 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tfr2j" podStartSLOduration=2.705612754 podStartE2EDuration="6.11145621s" podCreationTimestamp="2026-02-20 17:27:58 +0000 UTC" firstStartedPulling="2026-02-20 17:28:00.031021829 +0000 UTC m=+3387.811067277" lastFinishedPulling="2026-02-20 17:28:03.436865335 +0000 UTC m=+3391.216910733" observedRunningTime="2026-02-20 17:28:04.10738228 +0000 UTC m=+3391.887427698" watchObservedRunningTime="2026-02-20 17:28:04.11145621 +0000 UTC m=+3391.891501638" Feb 20 17:28:09 crc kubenswrapper[4697]: I0220 17:28:09.090753 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:28:09 crc kubenswrapper[4697]: I0220 17:28:09.091453 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:28:09 crc kubenswrapper[4697]: I0220 17:28:09.138138 4697 generic.go:334] "Generic (PLEG): container finished" podID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerID="ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65" exitCode=0 Feb 20 17:28:09 crc kubenswrapper[4697]: I0220 17:28:09.138193 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-src9q" 
event={"ID":"013a568d-ab78-4ce3-922d-9cfa6c79d175","Type":"ContainerDied","Data":"ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65"} Feb 20 17:28:10 crc kubenswrapper[4697]: I0220 17:28:10.148749 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-tfr2j" podUID="62666ed7-0943-42a6-8ddb-37319044100d" containerName="registry-server" probeResult="failure" output=< Feb 20 17:28:10 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 17:28:10 crc kubenswrapper[4697]: > Feb 20 17:28:10 crc kubenswrapper[4697]: I0220 17:28:10.149281 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-src9q" event={"ID":"013a568d-ab78-4ce3-922d-9cfa6c79d175","Type":"ContainerStarted","Data":"3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4"} Feb 20 17:28:10 crc kubenswrapper[4697]: I0220 17:28:10.172573 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-src9q" podStartSLOduration=2.440204877 podStartE2EDuration="9.1725568s" podCreationTimestamp="2026-02-20 17:28:01 +0000 UTC" firstStartedPulling="2026-02-20 17:28:03.07020297 +0000 UTC m=+3390.850248378" lastFinishedPulling="2026-02-20 17:28:09.802554893 +0000 UTC m=+3397.582600301" observedRunningTime="2026-02-20 17:28:10.17093524 +0000 UTC m=+3397.950980638" watchObservedRunningTime="2026-02-20 17:28:10.1725568 +0000 UTC m=+3397.952602208" Feb 20 17:28:11 crc kubenswrapper[4697]: I0220 17:28:11.538317 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:11 crc kubenswrapper[4697]: I0220 17:28:11.539405 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:12 crc kubenswrapper[4697]: I0220 17:28:12.601897 4697 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-src9q" podUID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerName="registry-server" probeResult="failure" output=< Feb 20 17:28:12 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 17:28:12 crc kubenswrapper[4697]: > Feb 20 17:28:19 crc kubenswrapper[4697]: I0220 17:28:19.143033 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:28:19 crc kubenswrapper[4697]: I0220 17:28:19.195046 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:28:19 crc kubenswrapper[4697]: I0220 17:28:19.382239 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tfr2j"] Feb 20 17:28:20 crc kubenswrapper[4697]: I0220 17:28:20.250179 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tfr2j" podUID="62666ed7-0943-42a6-8ddb-37319044100d" containerName="registry-server" containerID="cri-o://c07171e467c5c1c260c993582591e4c41f2a97381d529a189742d72d33e08579" gracePeriod=2 Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.258394 4697 generic.go:334] "Generic (PLEG): container finished" podID="62666ed7-0943-42a6-8ddb-37319044100d" containerID="c07171e467c5c1c260c993582591e4c41f2a97381d529a189742d72d33e08579" exitCode=0 Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.258464 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfr2j" event={"ID":"62666ed7-0943-42a6-8ddb-37319044100d","Type":"ContainerDied","Data":"c07171e467c5c1c260c993582591e4c41f2a97381d529a189742d72d33e08579"} Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.258713 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfr2j" 
event={"ID":"62666ed7-0943-42a6-8ddb-37319044100d","Type":"ContainerDied","Data":"15dad28ff0f001e8c88be387740ef269a653a0583c3e14bcd7f4bdb8772ab7a9"} Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.258725 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15dad28ff0f001e8c88be387740ef269a653a0583c3e14bcd7f4bdb8772ab7a9" Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.304607 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.338020 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-utilities\") pod \"62666ed7-0943-42a6-8ddb-37319044100d\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.338203 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25pxq\" (UniqueName: \"kubernetes.io/projected/62666ed7-0943-42a6-8ddb-37319044100d-kube-api-access-25pxq\") pod \"62666ed7-0943-42a6-8ddb-37319044100d\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.338281 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-catalog-content\") pod \"62666ed7-0943-42a6-8ddb-37319044100d\" (UID: \"62666ed7-0943-42a6-8ddb-37319044100d\") " Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.339361 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-utilities" (OuterVolumeSpecName: "utilities") pod "62666ed7-0943-42a6-8ddb-37319044100d" (UID: "62666ed7-0943-42a6-8ddb-37319044100d"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.340927 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.345751 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62666ed7-0943-42a6-8ddb-37319044100d-kube-api-access-25pxq" (OuterVolumeSpecName: "kube-api-access-25pxq") pod "62666ed7-0943-42a6-8ddb-37319044100d" (UID: "62666ed7-0943-42a6-8ddb-37319044100d"). InnerVolumeSpecName "kube-api-access-25pxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.403990 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62666ed7-0943-42a6-8ddb-37319044100d" (UID: "62666ed7-0943-42a6-8ddb-37319044100d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.443953 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62666ed7-0943-42a6-8ddb-37319044100d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.443990 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25pxq\" (UniqueName: \"kubernetes.io/projected/62666ed7-0943-42a6-8ddb-37319044100d-kube-api-access-25pxq\") on node \"crc\" DevicePath \"\"" Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.534028 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:21 crc kubenswrapper[4697]: I0220 17:28:21.589027 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:22 crc kubenswrapper[4697]: I0220 17:28:22.265273 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tfr2j" Feb 20 17:28:22 crc kubenswrapper[4697]: I0220 17:28:22.299678 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tfr2j"] Feb 20 17:28:22 crc kubenswrapper[4697]: I0220 17:28:22.311478 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tfr2j"] Feb 20 17:28:22 crc kubenswrapper[4697]: I0220 17:28:22.889732 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62666ed7-0943-42a6-8ddb-37319044100d" path="/var/lib/kubelet/pods/62666ed7-0943-42a6-8ddb-37319044100d/volumes" Feb 20 17:28:23 crc kubenswrapper[4697]: I0220 17:28:23.184773 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-src9q"] Feb 20 17:28:23 crc kubenswrapper[4697]: I0220 17:28:23.323161 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-src9q" podUID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerName="registry-server" containerID="cri-o://3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4" gracePeriod=2 Feb 20 17:28:23 crc kubenswrapper[4697]: I0220 17:28:23.764452 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:23 crc kubenswrapper[4697]: I0220 17:28:23.905097 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-catalog-content\") pod \"013a568d-ab78-4ce3-922d-9cfa6c79d175\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " Feb 20 17:28:23 crc kubenswrapper[4697]: I0220 17:28:23.905156 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-utilities\") pod \"013a568d-ab78-4ce3-922d-9cfa6c79d175\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " Feb 20 17:28:23 crc kubenswrapper[4697]: I0220 17:28:23.905303 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj8j6\" (UniqueName: \"kubernetes.io/projected/013a568d-ab78-4ce3-922d-9cfa6c79d175-kube-api-access-rj8j6\") pod \"013a568d-ab78-4ce3-922d-9cfa6c79d175\" (UID: \"013a568d-ab78-4ce3-922d-9cfa6c79d175\") " Feb 20 17:28:23 crc kubenswrapper[4697]: I0220 17:28:23.906095 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-utilities" (OuterVolumeSpecName: "utilities") pod "013a568d-ab78-4ce3-922d-9cfa6c79d175" (UID: "013a568d-ab78-4ce3-922d-9cfa6c79d175"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:28:23 crc kubenswrapper[4697]: I0220 17:28:23.911693 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013a568d-ab78-4ce3-922d-9cfa6c79d175-kube-api-access-rj8j6" (OuterVolumeSpecName: "kube-api-access-rj8j6") pod "013a568d-ab78-4ce3-922d-9cfa6c79d175" (UID: "013a568d-ab78-4ce3-922d-9cfa6c79d175"). InnerVolumeSpecName "kube-api-access-rj8j6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.008615 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.008654 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj8j6\" (UniqueName: \"kubernetes.io/projected/013a568d-ab78-4ce3-922d-9cfa6c79d175-kube-api-access-rj8j6\") on node \"crc\" DevicePath \"\"" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.032716 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "013a568d-ab78-4ce3-922d-9cfa6c79d175" (UID: "013a568d-ab78-4ce3-922d-9cfa6c79d175"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.109993 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013a568d-ab78-4ce3-922d-9cfa6c79d175-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.334191 4697 generic.go:334] "Generic (PLEG): container finished" podID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerID="3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4" exitCode=0 Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.334249 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-src9q" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.334269 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-src9q" event={"ID":"013a568d-ab78-4ce3-922d-9cfa6c79d175","Type":"ContainerDied","Data":"3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4"} Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.334723 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-src9q" event={"ID":"013a568d-ab78-4ce3-922d-9cfa6c79d175","Type":"ContainerDied","Data":"e66e6b5c65b5f632c6229ffeb5a6cefa2f16aa6a6e6815337ec8611c04c8492f"} Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.334741 4697 scope.go:117] "RemoveContainer" containerID="3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.358552 4697 scope.go:117] "RemoveContainer" containerID="ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.391756 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-src9q"] Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.399214 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-src9q"] Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.407207 4697 scope.go:117] "RemoveContainer" containerID="ac29ad8d4d9b8d81ad512a338a0a1158a3092e6549241195b0decb73840a8aae" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.459741 4697 scope.go:117] "RemoveContainer" containerID="3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4" Feb 20 17:28:24 crc kubenswrapper[4697]: E0220 17:28:24.460345 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4\": container with ID starting with 3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4 not found: ID does not exist" containerID="3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.460391 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4"} err="failed to get container status \"3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4\": rpc error: code = NotFound desc = could not find container \"3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4\": container with ID starting with 3adc4bde713fb33f7fdf6f9bbd2f3f343b497cdbb2c1fa4fbf7a0dc1ba92c9e4 not found: ID does not exist" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.460417 4697 scope.go:117] "RemoveContainer" containerID="ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65" Feb 20 17:28:24 crc kubenswrapper[4697]: E0220 17:28:24.460851 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65\": container with ID starting with ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65 not found: ID does not exist" containerID="ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.460889 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65"} err="failed to get container status \"ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65\": rpc error: code = NotFound desc = could not find container \"ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65\": container with ID 
starting with ae73aa595ab5c568a0f1a6e6851347cb6055310e7b2bbd9718c5037addc2cd65 not found: ID does not exist" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.460916 4697 scope.go:117] "RemoveContainer" containerID="ac29ad8d4d9b8d81ad512a338a0a1158a3092e6549241195b0decb73840a8aae" Feb 20 17:28:24 crc kubenswrapper[4697]: E0220 17:28:24.461288 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac29ad8d4d9b8d81ad512a338a0a1158a3092e6549241195b0decb73840a8aae\": container with ID starting with ac29ad8d4d9b8d81ad512a338a0a1158a3092e6549241195b0decb73840a8aae not found: ID does not exist" containerID="ac29ad8d4d9b8d81ad512a338a0a1158a3092e6549241195b0decb73840a8aae" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.461321 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac29ad8d4d9b8d81ad512a338a0a1158a3092e6549241195b0decb73840a8aae"} err="failed to get container status \"ac29ad8d4d9b8d81ad512a338a0a1158a3092e6549241195b0decb73840a8aae\": rpc error: code = NotFound desc = could not find container \"ac29ad8d4d9b8d81ad512a338a0a1158a3092e6549241195b0decb73840a8aae\": container with ID starting with ac29ad8d4d9b8d81ad512a338a0a1158a3092e6549241195b0decb73840a8aae not found: ID does not exist" Feb 20 17:28:24 crc kubenswrapper[4697]: I0220 17:28:24.893892 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013a568d-ab78-4ce3-922d-9cfa6c79d175" path="/var/lib/kubelet/pods/013a568d-ab78-4ce3-922d-9cfa6c79d175/volumes" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.153756 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg"] Feb 20 17:30:00 crc kubenswrapper[4697]: E0220 17:30:00.154817 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62666ed7-0943-42a6-8ddb-37319044100d" containerName="extract-utilities" Feb 
20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.154834 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="62666ed7-0943-42a6-8ddb-37319044100d" containerName="extract-utilities" Feb 20 17:30:00 crc kubenswrapper[4697]: E0220 17:30:00.154872 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerName="registry-server" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.154884 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerName="registry-server" Feb 20 17:30:00 crc kubenswrapper[4697]: E0220 17:30:00.154906 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerName="extract-utilities" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.154916 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerName="extract-utilities" Feb 20 17:30:00 crc kubenswrapper[4697]: E0220 17:30:00.154931 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerName="extract-content" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.154939 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerName="extract-content" Feb 20 17:30:00 crc kubenswrapper[4697]: E0220 17:30:00.154957 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62666ed7-0943-42a6-8ddb-37319044100d" containerName="extract-content" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.154965 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="62666ed7-0943-42a6-8ddb-37319044100d" containerName="extract-content" Feb 20 17:30:00 crc kubenswrapper[4697]: E0220 17:30:00.154995 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62666ed7-0943-42a6-8ddb-37319044100d" containerName="registry-server" Feb 20 
17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.155003 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="62666ed7-0943-42a6-8ddb-37319044100d" containerName="registry-server" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.155223 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="62666ed7-0943-42a6-8ddb-37319044100d" containerName="registry-server" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.155259 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="013a568d-ab78-4ce3-922d-9cfa6c79d175" containerName="registry-server" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.156798 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.163241 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.164563 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.170097 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg"] Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.284612 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31d241f3-bb64-4792-9b76-be72fefb2c4a-config-volume\") pod \"collect-profiles-29526810-4tlgg\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.284880 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/31d241f3-bb64-4792-9b76-be72fefb2c4a-secret-volume\") pod \"collect-profiles-29526810-4tlgg\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.284921 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhtgc\" (UniqueName: \"kubernetes.io/projected/31d241f3-bb64-4792-9b76-be72fefb2c4a-kube-api-access-lhtgc\") pod \"collect-profiles-29526810-4tlgg\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.386797 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/31d241f3-bb64-4792-9b76-be72fefb2c4a-secret-volume\") pod \"collect-profiles-29526810-4tlgg\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.386852 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhtgc\" (UniqueName: \"kubernetes.io/projected/31d241f3-bb64-4792-9b76-be72fefb2c4a-kube-api-access-lhtgc\") pod \"collect-profiles-29526810-4tlgg\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.386911 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31d241f3-bb64-4792-9b76-be72fefb2c4a-config-volume\") pod \"collect-profiles-29526810-4tlgg\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.388025 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31d241f3-bb64-4792-9b76-be72fefb2c4a-config-volume\") pod \"collect-profiles-29526810-4tlgg\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.401106 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/31d241f3-bb64-4792-9b76-be72fefb2c4a-secret-volume\") pod \"collect-profiles-29526810-4tlgg\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.421071 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhtgc\" (UniqueName: \"kubernetes.io/projected/31d241f3-bb64-4792-9b76-be72fefb2c4a-kube-api-access-lhtgc\") pod \"collect-profiles-29526810-4tlgg\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.485787 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:00 crc kubenswrapper[4697]: I0220 17:30:00.951468 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg"] Feb 20 17:30:01 crc kubenswrapper[4697]: I0220 17:30:01.184842 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:30:01 crc kubenswrapper[4697]: I0220 17:30:01.184925 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:30:01 crc kubenswrapper[4697]: I0220 17:30:01.246931 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" event={"ID":"31d241f3-bb64-4792-9b76-be72fefb2c4a","Type":"ContainerStarted","Data":"3ce9fb2c1c631ff55e3ac3151f23ac5d871102a94c4ea94e42152e4ad4a4c8bc"} Feb 20 17:30:01 crc kubenswrapper[4697]: I0220 17:30:01.247013 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" event={"ID":"31d241f3-bb64-4792-9b76-be72fefb2c4a","Type":"ContainerStarted","Data":"d30053d585e050ccae5dea3187704c377c4f3da7e84e821a093f0fd7518df47b"} Feb 20 17:30:01 crc kubenswrapper[4697]: I0220 17:30:01.276821 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" podStartSLOduration=1.27679611 
podStartE2EDuration="1.27679611s" podCreationTimestamp="2026-02-20 17:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 17:30:01.266704323 +0000 UTC m=+3509.046749731" watchObservedRunningTime="2026-02-20 17:30:01.27679611 +0000 UTC m=+3509.056841568" Feb 20 17:30:02 crc kubenswrapper[4697]: I0220 17:30:02.255715 4697 generic.go:334] "Generic (PLEG): container finished" podID="31d241f3-bb64-4792-9b76-be72fefb2c4a" containerID="3ce9fb2c1c631ff55e3ac3151f23ac5d871102a94c4ea94e42152e4ad4a4c8bc" exitCode=0 Feb 20 17:30:02 crc kubenswrapper[4697]: I0220 17:30:02.255788 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" event={"ID":"31d241f3-bb64-4792-9b76-be72fefb2c4a","Type":"ContainerDied","Data":"3ce9fb2c1c631ff55e3ac3151f23ac5d871102a94c4ea94e42152e4ad4a4c8bc"} Feb 20 17:30:03 crc kubenswrapper[4697]: I0220 17:30:03.625942 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:03 crc kubenswrapper[4697]: I0220 17:30:03.765954 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhtgc\" (UniqueName: \"kubernetes.io/projected/31d241f3-bb64-4792-9b76-be72fefb2c4a-kube-api-access-lhtgc\") pod \"31d241f3-bb64-4792-9b76-be72fefb2c4a\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " Feb 20 17:30:03 crc kubenswrapper[4697]: I0220 17:30:03.766033 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/31d241f3-bb64-4792-9b76-be72fefb2c4a-secret-volume\") pod \"31d241f3-bb64-4792-9b76-be72fefb2c4a\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " Feb 20 17:30:03 crc kubenswrapper[4697]: I0220 17:30:03.766064 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31d241f3-bb64-4792-9b76-be72fefb2c4a-config-volume\") pod \"31d241f3-bb64-4792-9b76-be72fefb2c4a\" (UID: \"31d241f3-bb64-4792-9b76-be72fefb2c4a\") " Feb 20 17:30:03 crc kubenswrapper[4697]: I0220 17:30:03.767209 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d241f3-bb64-4792-9b76-be72fefb2c4a-config-volume" (OuterVolumeSpecName: "config-volume") pod "31d241f3-bb64-4792-9b76-be72fefb2c4a" (UID: "31d241f3-bb64-4792-9b76-be72fefb2c4a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 17:30:03 crc kubenswrapper[4697]: I0220 17:30:03.775645 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d241f3-bb64-4792-9b76-be72fefb2c4a-kube-api-access-lhtgc" (OuterVolumeSpecName: "kube-api-access-lhtgc") pod "31d241f3-bb64-4792-9b76-be72fefb2c4a" (UID: "31d241f3-bb64-4792-9b76-be72fefb2c4a"). 
InnerVolumeSpecName "kube-api-access-lhtgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:30:03 crc kubenswrapper[4697]: I0220 17:30:03.786284 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d241f3-bb64-4792-9b76-be72fefb2c4a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "31d241f3-bb64-4792-9b76-be72fefb2c4a" (UID: "31d241f3-bb64-4792-9b76-be72fefb2c4a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:30:03 crc kubenswrapper[4697]: I0220 17:30:03.868661 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhtgc\" (UniqueName: \"kubernetes.io/projected/31d241f3-bb64-4792-9b76-be72fefb2c4a-kube-api-access-lhtgc\") on node \"crc\" DevicePath \"\"" Feb 20 17:30:03 crc kubenswrapper[4697]: I0220 17:30:03.868690 4697 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/31d241f3-bb64-4792-9b76-be72fefb2c4a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 17:30:03 crc kubenswrapper[4697]: I0220 17:30:03.868700 4697 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31d241f3-bb64-4792-9b76-be72fefb2c4a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 17:30:04 crc kubenswrapper[4697]: I0220 17:30:04.278750 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" event={"ID":"31d241f3-bb64-4792-9b76-be72fefb2c4a","Type":"ContainerDied","Data":"d30053d585e050ccae5dea3187704c377c4f3da7e84e821a093f0fd7518df47b"} Feb 20 17:30:04 crc kubenswrapper[4697]: I0220 17:30:04.279064 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d30053d585e050ccae5dea3187704c377c4f3da7e84e821a093f0fd7518df47b" Feb 20 17:30:04 crc kubenswrapper[4697]: I0220 17:30:04.278803 4697 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg" Feb 20 17:30:04 crc kubenswrapper[4697]: I0220 17:30:04.353835 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh"] Feb 20 17:30:04 crc kubenswrapper[4697]: I0220 17:30:04.367814 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526765-qhthh"] Feb 20 17:30:04 crc kubenswrapper[4697]: I0220 17:30:04.889565 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3c8641f-25ce-4499-9cfc-ba7c464f2097" path="/var/lib/kubelet/pods/d3c8641f-25ce-4499-9cfc-ba7c464f2097/volumes" Feb 20 17:30:31 crc kubenswrapper[4697]: I0220 17:30:31.184576 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:30:31 crc kubenswrapper[4697]: I0220 17:30:31.185108 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:30:46 crc kubenswrapper[4697]: I0220 17:30:46.744693 4697 scope.go:117] "RemoveContainer" containerID="d821b6203adb7f9f04719048cd1fc9e494b0bab5605a9740d71e93eace5e0b25" Feb 20 17:31:01 crc kubenswrapper[4697]: I0220 17:31:01.184773 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:31:01 crc kubenswrapper[4697]: I0220 17:31:01.185694 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:31:01 crc kubenswrapper[4697]: I0220 17:31:01.185783 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 17:31:01 crc kubenswrapper[4697]: I0220 17:31:01.187104 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 17:31:01 crc kubenswrapper[4697]: I0220 17:31:01.187264 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" gracePeriod=600 Feb 20 17:31:01 crc kubenswrapper[4697]: E0220 17:31:01.314914 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:31:01 crc 
kubenswrapper[4697]: I0220 17:31:01.828462 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb"} Feb 20 17:31:01 crc kubenswrapper[4697]: I0220 17:31:01.828461 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" exitCode=0 Feb 20 17:31:01 crc kubenswrapper[4697]: I0220 17:31:01.828525 4697 scope.go:117] "RemoveContainer" containerID="f73f37b72ce2980f471c790db89bd17d540086f1537271c05d48495337743421" Feb 20 17:31:01 crc kubenswrapper[4697]: I0220 17:31:01.829293 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:31:01 crc kubenswrapper[4697]: E0220 17:31:01.829668 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:31:10 crc kubenswrapper[4697]: E0220 17:31:10.968961 4697 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Feb 20 17:31:16 crc kubenswrapper[4697]: I0220 17:31:16.877707 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:31:16 crc kubenswrapper[4697]: E0220 17:31:16.879249 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:31:28 crc kubenswrapper[4697]: I0220 17:31:28.877538 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:31:28 crc kubenswrapper[4697]: E0220 17:31:28.878271 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:31:39 crc kubenswrapper[4697]: I0220 17:31:39.878866 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:31:39 crc kubenswrapper[4697]: E0220 17:31:39.881012 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:31:51 crc kubenswrapper[4697]: I0220 17:31:51.877218 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:31:51 crc kubenswrapper[4697]: E0220 17:31:51.878316 4697 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:32:03 crc kubenswrapper[4697]: I0220 17:32:03.878915 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:32:03 crc kubenswrapper[4697]: E0220 17:32:03.880164 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:32:11 crc kubenswrapper[4697]: E0220 17:32:11.984431 4697 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.44:49976->38.102.83.44:39463: write tcp 38.102.83.44:49976->38.102.83.44:39463: write: broken pipe Feb 20 17:32:14 crc kubenswrapper[4697]: I0220 17:32:14.877180 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:32:14 crc kubenswrapper[4697]: E0220 17:32:14.877864 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:32:26 
crc kubenswrapper[4697]: I0220 17:32:26.877739 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:32:26 crc kubenswrapper[4697]: E0220 17:32:26.878522 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:32:37 crc kubenswrapper[4697]: I0220 17:32:37.877608 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:32:37 crc kubenswrapper[4697]: E0220 17:32:37.878420 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:32:49 crc kubenswrapper[4697]: I0220 17:32:49.877751 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:32:49 crc kubenswrapper[4697]: E0220 17:32:49.878614 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" 
Feb 20 17:33:03 crc kubenswrapper[4697]: I0220 17:33:03.877209 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:33:03 crc kubenswrapper[4697]: E0220 17:33:03.878106 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:33:15 crc kubenswrapper[4697]: I0220 17:33:15.876829 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:33:15 crc kubenswrapper[4697]: E0220 17:33:15.877488 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:33:27 crc kubenswrapper[4697]: I0220 17:33:27.877368 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:33:27 crc kubenswrapper[4697]: E0220 17:33:27.878388 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" 
podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:33:34 crc kubenswrapper[4697]: I0220 17:33:34.895782 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rs9w7"] Feb 20 17:33:34 crc kubenswrapper[4697]: E0220 17:33:34.896482 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d241f3-bb64-4792-9b76-be72fefb2c4a" containerName="collect-profiles" Feb 20 17:33:34 crc kubenswrapper[4697]: I0220 17:33:34.896493 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d241f3-bb64-4792-9b76-be72fefb2c4a" containerName="collect-profiles" Feb 20 17:33:34 crc kubenswrapper[4697]: I0220 17:33:34.896675 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d241f3-bb64-4792-9b76-be72fefb2c4a" containerName="collect-profiles" Feb 20 17:33:34 crc kubenswrapper[4697]: I0220 17:33:34.898084 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:34 crc kubenswrapper[4697]: I0220 17:33:34.914053 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rs9w7"] Feb 20 17:33:34 crc kubenswrapper[4697]: I0220 17:33:34.989540 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-utilities\") pod \"redhat-marketplace-rs9w7\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:34 crc kubenswrapper[4697]: I0220 17:33:34.989623 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dp28\" (UniqueName: \"kubernetes.io/projected/315ef60b-cb99-4f19-b608-f574def45486-kube-api-access-8dp28\") pod \"redhat-marketplace-rs9w7\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " 
pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:34 crc kubenswrapper[4697]: I0220 17:33:34.989783 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-catalog-content\") pod \"redhat-marketplace-rs9w7\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:35 crc kubenswrapper[4697]: I0220 17:33:35.092754 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-utilities\") pod \"redhat-marketplace-rs9w7\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:35 crc kubenswrapper[4697]: I0220 17:33:35.093072 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dp28\" (UniqueName: \"kubernetes.io/projected/315ef60b-cb99-4f19-b608-f574def45486-kube-api-access-8dp28\") pod \"redhat-marketplace-rs9w7\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:35 crc kubenswrapper[4697]: I0220 17:33:35.093266 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-catalog-content\") pod \"redhat-marketplace-rs9w7\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:35 crc kubenswrapper[4697]: I0220 17:33:35.093294 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-utilities\") pod \"redhat-marketplace-rs9w7\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " 
pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:35 crc kubenswrapper[4697]: I0220 17:33:35.093629 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-catalog-content\") pod \"redhat-marketplace-rs9w7\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:35 crc kubenswrapper[4697]: I0220 17:33:35.132418 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dp28\" (UniqueName: \"kubernetes.io/projected/315ef60b-cb99-4f19-b608-f574def45486-kube-api-access-8dp28\") pod \"redhat-marketplace-rs9w7\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:35 crc kubenswrapper[4697]: I0220 17:33:35.267309 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:35 crc kubenswrapper[4697]: I0220 17:33:35.747270 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rs9w7"] Feb 20 17:33:36 crc kubenswrapper[4697]: I0220 17:33:36.477736 4697 generic.go:334] "Generic (PLEG): container finished" podID="315ef60b-cb99-4f19-b608-f574def45486" containerID="2b23e47f153d135c7a4f9a3eda0decc8cd33a2b015fc932cc8ec6b40c8098758" exitCode=0 Feb 20 17:33:36 crc kubenswrapper[4697]: I0220 17:33:36.477798 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs9w7" event={"ID":"315ef60b-cb99-4f19-b608-f574def45486","Type":"ContainerDied","Data":"2b23e47f153d135c7a4f9a3eda0decc8cd33a2b015fc932cc8ec6b40c8098758"} Feb 20 17:33:36 crc kubenswrapper[4697]: I0220 17:33:36.478510 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs9w7" 
event={"ID":"315ef60b-cb99-4f19-b608-f574def45486","Type":"ContainerStarted","Data":"ec388e2edb4955ca8c7102c83e01be5a77d0636c2e25b69807cf5e4f72a7bfd7"} Feb 20 17:33:36 crc kubenswrapper[4697]: I0220 17:33:36.482794 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 17:33:37 crc kubenswrapper[4697]: I0220 17:33:37.506321 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs9w7" event={"ID":"315ef60b-cb99-4f19-b608-f574def45486","Type":"ContainerStarted","Data":"cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5"} Feb 20 17:33:38 crc kubenswrapper[4697]: I0220 17:33:38.519406 4697 generic.go:334] "Generic (PLEG): container finished" podID="315ef60b-cb99-4f19-b608-f574def45486" containerID="cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5" exitCode=0 Feb 20 17:33:38 crc kubenswrapper[4697]: I0220 17:33:38.519522 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs9w7" event={"ID":"315ef60b-cb99-4f19-b608-f574def45486","Type":"ContainerDied","Data":"cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5"} Feb 20 17:33:39 crc kubenswrapper[4697]: I0220 17:33:39.532652 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs9w7" event={"ID":"315ef60b-cb99-4f19-b608-f574def45486","Type":"ContainerStarted","Data":"a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385"} Feb 20 17:33:39 crc kubenswrapper[4697]: I0220 17:33:39.560038 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rs9w7" podStartSLOduration=3.132473504 podStartE2EDuration="5.560023383s" podCreationTimestamp="2026-02-20 17:33:34 +0000 UTC" firstStartedPulling="2026-02-20 17:33:36.482394756 +0000 UTC m=+3724.262440164" lastFinishedPulling="2026-02-20 17:33:38.909944635 +0000 UTC 
m=+3726.689990043" observedRunningTime="2026-02-20 17:33:39.556690662 +0000 UTC m=+3727.336736060" watchObservedRunningTime="2026-02-20 17:33:39.560023383 +0000 UTC m=+3727.340068791" Feb 20 17:33:39 crc kubenswrapper[4697]: I0220 17:33:39.877744 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:33:39 crc kubenswrapper[4697]: E0220 17:33:39.878016 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:33:45 crc kubenswrapper[4697]: I0220 17:33:45.267424 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:45 crc kubenswrapper[4697]: I0220 17:33:45.268032 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:45 crc kubenswrapper[4697]: I0220 17:33:45.331739 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:45 crc kubenswrapper[4697]: I0220 17:33:45.641126 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:45 crc kubenswrapper[4697]: I0220 17:33:45.692840 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rs9w7"] Feb 20 17:33:47 crc kubenswrapper[4697]: I0220 17:33:47.616566 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rs9w7" 
podUID="315ef60b-cb99-4f19-b608-f574def45486" containerName="registry-server" containerID="cri-o://a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385" gracePeriod=2 Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.086037 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.166654 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-utilities\") pod \"315ef60b-cb99-4f19-b608-f574def45486\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.166725 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-catalog-content\") pod \"315ef60b-cb99-4f19-b608-f574def45486\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.166866 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dp28\" (UniqueName: \"kubernetes.io/projected/315ef60b-cb99-4f19-b608-f574def45486-kube-api-access-8dp28\") pod \"315ef60b-cb99-4f19-b608-f574def45486\" (UID: \"315ef60b-cb99-4f19-b608-f574def45486\") " Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.167547 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-utilities" (OuterVolumeSpecName: "utilities") pod "315ef60b-cb99-4f19-b608-f574def45486" (UID: "315ef60b-cb99-4f19-b608-f574def45486"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.186134 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315ef60b-cb99-4f19-b608-f574def45486-kube-api-access-8dp28" (OuterVolumeSpecName: "kube-api-access-8dp28") pod "315ef60b-cb99-4f19-b608-f574def45486" (UID: "315ef60b-cb99-4f19-b608-f574def45486"). InnerVolumeSpecName "kube-api-access-8dp28". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.194311 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "315ef60b-cb99-4f19-b608-f574def45486" (UID: "315ef60b-cb99-4f19-b608-f574def45486"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.269554 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.269592 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dp28\" (UniqueName: \"kubernetes.io/projected/315ef60b-cb99-4f19-b608-f574def45486-kube-api-access-8dp28\") on node \"crc\" DevicePath \"\"" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.269605 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/315ef60b-cb99-4f19-b608-f574def45486-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.626692 4697 generic.go:334] "Generic (PLEG): container finished" podID="315ef60b-cb99-4f19-b608-f574def45486" 
containerID="a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385" exitCode=0 Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.626750 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs9w7" event={"ID":"315ef60b-cb99-4f19-b608-f574def45486","Type":"ContainerDied","Data":"a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385"} Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.627051 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs9w7" event={"ID":"315ef60b-cb99-4f19-b608-f574def45486","Type":"ContainerDied","Data":"ec388e2edb4955ca8c7102c83e01be5a77d0636c2e25b69807cf5e4f72a7bfd7"} Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.627071 4697 scope.go:117] "RemoveContainer" containerID="a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.626760 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rs9w7" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.654890 4697 scope.go:117] "RemoveContainer" containerID="cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.687415 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rs9w7"] Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.696350 4697 scope.go:117] "RemoveContainer" containerID="2b23e47f153d135c7a4f9a3eda0decc8cd33a2b015fc932cc8ec6b40c8098758" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.698040 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rs9w7"] Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.752938 4697 scope.go:117] "RemoveContainer" containerID="a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385" Feb 20 17:33:48 crc kubenswrapper[4697]: E0220 17:33:48.753418 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385\": container with ID starting with a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385 not found: ID does not exist" containerID="a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.753470 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385"} err="failed to get container status \"a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385\": rpc error: code = NotFound desc = could not find container \"a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385\": container with ID starting with a31a0c80c2435919805b0c5eaa2a08574ea55687ea3d37b343a3521640284385 not found: 
ID does not exist" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.753499 4697 scope.go:117] "RemoveContainer" containerID="cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5" Feb 20 17:33:48 crc kubenswrapper[4697]: E0220 17:33:48.753725 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5\": container with ID starting with cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5 not found: ID does not exist" containerID="cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.753751 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5"} err="failed to get container status \"cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5\": rpc error: code = NotFound desc = could not find container \"cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5\": container with ID starting with cda26d9c54b036c7632130207154f7ffc335686150fb5d17f4d9743026a7dbd5 not found: ID does not exist" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.753766 4697 scope.go:117] "RemoveContainer" containerID="2b23e47f153d135c7a4f9a3eda0decc8cd33a2b015fc932cc8ec6b40c8098758" Feb 20 17:33:48 crc kubenswrapper[4697]: E0220 17:33:48.757750 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b23e47f153d135c7a4f9a3eda0decc8cd33a2b015fc932cc8ec6b40c8098758\": container with ID starting with 2b23e47f153d135c7a4f9a3eda0decc8cd33a2b015fc932cc8ec6b40c8098758 not found: ID does not exist" containerID="2b23e47f153d135c7a4f9a3eda0decc8cd33a2b015fc932cc8ec6b40c8098758" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.757785 4697 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b23e47f153d135c7a4f9a3eda0decc8cd33a2b015fc932cc8ec6b40c8098758"} err="failed to get container status \"2b23e47f153d135c7a4f9a3eda0decc8cd33a2b015fc932cc8ec6b40c8098758\": rpc error: code = NotFound desc = could not find container \"2b23e47f153d135c7a4f9a3eda0decc8cd33a2b015fc932cc8ec6b40c8098758\": container with ID starting with 2b23e47f153d135c7a4f9a3eda0decc8cd33a2b015fc932cc8ec6b40c8098758 not found: ID does not exist" Feb 20 17:33:48 crc kubenswrapper[4697]: I0220 17:33:48.894104 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315ef60b-cb99-4f19-b608-f574def45486" path="/var/lib/kubelet/pods/315ef60b-cb99-4f19-b608-f574def45486/volumes" Feb 20 17:33:51 crc kubenswrapper[4697]: I0220 17:33:51.877922 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:33:51 crc kubenswrapper[4697]: E0220 17:33:51.878649 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:34:03 crc kubenswrapper[4697]: I0220 17:34:03.892171 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:34:03 crc kubenswrapper[4697]: E0220 17:34:03.894969 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:34:14 crc kubenswrapper[4697]: I0220 17:34:14.877687 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:34:14 crc kubenswrapper[4697]: E0220 17:34:14.878971 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:34:25 crc kubenswrapper[4697]: I0220 17:34:25.876525 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:34:25 crc kubenswrapper[4697]: E0220 17:34:25.877370 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:34:38 crc kubenswrapper[4697]: I0220 17:34:38.877278 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:34:38 crc kubenswrapper[4697]: E0220 17:34:38.878204 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:34:46 crc kubenswrapper[4697]: I0220 17:34:46.873512 4697 scope.go:117] "RemoveContainer" containerID="c07171e467c5c1c260c993582591e4c41f2a97381d529a189742d72d33e08579" Feb 20 17:34:46 crc kubenswrapper[4697]: I0220 17:34:46.903654 4697 scope.go:117] "RemoveContainer" containerID="0d6abdcd9ee86d506dbe67836864d6f1c0be5bb38433978795d955a663306584" Feb 20 17:34:46 crc kubenswrapper[4697]: I0220 17:34:46.928987 4697 scope.go:117] "RemoveContainer" containerID="23ac8c4247d4b23209b53b3c8743f79c2a7e86d188b3806e4391616c8f443017" Feb 20 17:34:53 crc kubenswrapper[4697]: I0220 17:34:53.876810 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:34:53 crc kubenswrapper[4697]: E0220 17:34:53.878383 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:35:08 crc kubenswrapper[4697]: I0220 17:35:08.877931 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:35:08 crc kubenswrapper[4697]: E0220 17:35:08.878802 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:35:19 crc kubenswrapper[4697]: I0220 17:35:19.877259 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:35:19 crc kubenswrapper[4697]: E0220 17:35:19.878004 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:35:33 crc kubenswrapper[4697]: I0220 17:35:33.876919 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:35:33 crc kubenswrapper[4697]: E0220 17:35:33.877913 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:35:45 crc kubenswrapper[4697]: I0220 17:35:45.876974 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:35:45 crc kubenswrapper[4697]: E0220 17:35:45.877853 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:35:56 crc kubenswrapper[4697]: I0220 17:35:56.879184 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:35:56 crc kubenswrapper[4697]: E0220 17:35:56.880417 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:36:07 crc kubenswrapper[4697]: I0220 17:36:07.877842 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:36:08 crc kubenswrapper[4697]: I0220 17:36:08.220573 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"06d16abf023e436e56d3f9256ee36737fec7b615b838934ac0a5478653ced746"} Feb 20 17:38:31 crc kubenswrapper[4697]: I0220 17:38:31.185075 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:38:31 crc kubenswrapper[4697]: I0220 17:38:31.185656 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:39:01 crc kubenswrapper[4697]: I0220 17:39:01.185080 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:39:01 crc kubenswrapper[4697]: I0220 17:39:01.185553 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:39:31 crc kubenswrapper[4697]: I0220 17:39:31.185260 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:39:31 crc kubenswrapper[4697]: I0220 17:39:31.186249 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:39:31 crc kubenswrapper[4697]: I0220 17:39:31.186305 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 17:39:31 crc kubenswrapper[4697]: I0220 17:39:31.187658 4697 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"06d16abf023e436e56d3f9256ee36737fec7b615b838934ac0a5478653ced746"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 17:39:31 crc kubenswrapper[4697]: I0220 17:39:31.187737 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://06d16abf023e436e56d3f9256ee36737fec7b615b838934ac0a5478653ced746" gracePeriod=600 Feb 20 17:39:32 crc kubenswrapper[4697]: I0220 17:39:32.185122 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="06d16abf023e436e56d3f9256ee36737fec7b615b838934ac0a5478653ced746" exitCode=0 Feb 20 17:39:32 crc kubenswrapper[4697]: I0220 17:39:32.185215 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"06d16abf023e436e56d3f9256ee36737fec7b615b838934ac0a5478653ced746"} Feb 20 17:39:32 crc kubenswrapper[4697]: I0220 17:39:32.186446 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"} Feb 20 17:39:32 crc kubenswrapper[4697]: I0220 17:39:32.186488 4697 scope.go:117] "RemoveContainer" containerID="1f967eb2a340976e8557647a73008c79aba06fc0ef8b569c35f3cca55bc1eefb" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.186814 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l5cdg"] Feb 20 17:40:34 crc 
kubenswrapper[4697]: E0220 17:40:34.188015 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315ef60b-cb99-4f19-b608-f574def45486" containerName="extract-utilities" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.188036 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="315ef60b-cb99-4f19-b608-f574def45486" containerName="extract-utilities" Feb 20 17:40:34 crc kubenswrapper[4697]: E0220 17:40:34.188067 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315ef60b-cb99-4f19-b608-f574def45486" containerName="extract-content" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.188076 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="315ef60b-cb99-4f19-b608-f574def45486" containerName="extract-content" Feb 20 17:40:34 crc kubenswrapper[4697]: E0220 17:40:34.188107 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315ef60b-cb99-4f19-b608-f574def45486" containerName="registry-server" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.188117 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="315ef60b-cb99-4f19-b608-f574def45486" containerName="registry-server" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.188387 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="315ef60b-cb99-4f19-b608-f574def45486" containerName="registry-server" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.190404 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.219393 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5cdg"] Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.366154 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7mnk\" (UniqueName: \"kubernetes.io/projected/50352784-9415-4277-9122-aca9457a8363-kube-api-access-b7mnk\") pod \"redhat-operators-l5cdg\" (UID: \"50352784-9415-4277-9122-aca9457a8363\") " pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.366206 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-utilities\") pod \"redhat-operators-l5cdg\" (UID: \"50352784-9415-4277-9122-aca9457a8363\") " pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.366291 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-catalog-content\") pod \"redhat-operators-l5cdg\" (UID: \"50352784-9415-4277-9122-aca9457a8363\") " pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.468675 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7mnk\" (UniqueName: \"kubernetes.io/projected/50352784-9415-4277-9122-aca9457a8363-kube-api-access-b7mnk\") pod \"redhat-operators-l5cdg\" (UID: \"50352784-9415-4277-9122-aca9457a8363\") " pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.469025 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-utilities\") pod \"redhat-operators-l5cdg\" (UID: \"50352784-9415-4277-9122-aca9457a8363\") " pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.469261 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-catalog-content\") pod \"redhat-operators-l5cdg\" (UID: \"50352784-9415-4277-9122-aca9457a8363\") " pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.469669 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-utilities\") pod \"redhat-operators-l5cdg\" (UID: \"50352784-9415-4277-9122-aca9457a8363\") " pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.469856 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-catalog-content\") pod \"redhat-operators-l5cdg\" (UID: \"50352784-9415-4277-9122-aca9457a8363\") " pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.505763 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7mnk\" (UniqueName: \"kubernetes.io/projected/50352784-9415-4277-9122-aca9457a8363-kube-api-access-b7mnk\") pod \"redhat-operators-l5cdg\" (UID: \"50352784-9415-4277-9122-aca9457a8363\") " pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:34 crc kubenswrapper[4697]: I0220 17:40:34.531374 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:35 crc kubenswrapper[4697]: I0220 17:40:35.088841 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5cdg"] Feb 20 17:40:35 crc kubenswrapper[4697]: E0220 17:40:35.514727 4697 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50352784_9415_4277_9122_aca9457a8363.slice/crio-7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb.scope\": RecentStats: unable to find data in memory cache]" Feb 20 17:40:35 crc kubenswrapper[4697]: I0220 17:40:35.873231 4697 generic.go:334] "Generic (PLEG): container finished" podID="50352784-9415-4277-9122-aca9457a8363" containerID="7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb" exitCode=0 Feb 20 17:40:35 crc kubenswrapper[4697]: I0220 17:40:35.873759 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5cdg" event={"ID":"50352784-9415-4277-9122-aca9457a8363","Type":"ContainerDied","Data":"7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb"} Feb 20 17:40:35 crc kubenswrapper[4697]: I0220 17:40:35.873794 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5cdg" event={"ID":"50352784-9415-4277-9122-aca9457a8363","Type":"ContainerStarted","Data":"f0bf20b9be1c9b13c53a523f2a16df5645ad764480b67d9fd7c862b46f72ea06"} Feb 20 17:40:35 crc kubenswrapper[4697]: I0220 17:40:35.876355 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.586295 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-np6pn"] Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.589351 4697 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.597712 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-np6pn"] Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.759101 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-catalog-content\") pod \"certified-operators-np6pn\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.759160 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-utilities\") pod \"certified-operators-np6pn\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.759241 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twmw4\" (UniqueName: \"kubernetes.io/projected/4210e50a-a717-4c84-a1b7-6b50628cdc50-kube-api-access-twmw4\") pod \"certified-operators-np6pn\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.776771 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jcfjz"] Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.778698 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.802332 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jcfjz"] Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.860868 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-catalog-content\") pod \"certified-operators-np6pn\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.860940 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-utilities\") pod \"certified-operators-np6pn\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.861048 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twmw4\" (UniqueName: \"kubernetes.io/projected/4210e50a-a717-4c84-a1b7-6b50628cdc50-kube-api-access-twmw4\") pod \"certified-operators-np6pn\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.861867 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-utilities\") pod \"certified-operators-np6pn\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.861868 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-catalog-content\") pod \"certified-operators-np6pn\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.891831 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twmw4\" (UniqueName: \"kubernetes.io/projected/4210e50a-a717-4c84-a1b7-6b50628cdc50-kube-api-access-twmw4\") pod \"certified-operators-np6pn\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.963464 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg2t5\" (UniqueName: \"kubernetes.io/projected/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-kube-api-access-gg2t5\") pod \"community-operators-jcfjz\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.963547 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-catalog-content\") pod \"community-operators-jcfjz\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.963611 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-utilities\") pod \"community-operators-jcfjz\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:36 crc kubenswrapper[4697]: I0220 17:40:36.973185 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:37 crc kubenswrapper[4697]: I0220 17:40:37.066809 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg2t5\" (UniqueName: \"kubernetes.io/projected/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-kube-api-access-gg2t5\") pod \"community-operators-jcfjz\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:37 crc kubenswrapper[4697]: I0220 17:40:37.067337 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-catalog-content\") pod \"community-operators-jcfjz\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:37 crc kubenswrapper[4697]: I0220 17:40:37.067382 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-utilities\") pod \"community-operators-jcfjz\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:37 crc kubenswrapper[4697]: I0220 17:40:37.068096 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-utilities\") pod \"community-operators-jcfjz\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:37 crc kubenswrapper[4697]: I0220 17:40:37.069421 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-catalog-content\") pod \"community-operators-jcfjz\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " 
pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:37 crc kubenswrapper[4697]: I0220 17:40:37.088746 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg2t5\" (UniqueName: \"kubernetes.io/projected/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-kube-api-access-gg2t5\") pod \"community-operators-jcfjz\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:37 crc kubenswrapper[4697]: I0220 17:40:37.095562 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:37 crc kubenswrapper[4697]: I0220 17:40:37.605977 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-np6pn"] Feb 20 17:40:37 crc kubenswrapper[4697]: I0220 17:40:37.745846 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jcfjz"] Feb 20 17:40:37 crc kubenswrapper[4697]: I0220 17:40:37.921017 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5cdg" event={"ID":"50352784-9415-4277-9122-aca9457a8363","Type":"ContainerStarted","Data":"0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4"} Feb 20 17:40:38 crc kubenswrapper[4697]: I0220 17:40:38.947210 4697 generic.go:334] "Generic (PLEG): container finished" podID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerID="e7faa31a3f25b86aec7f9e4a4976af0243c9cc578cfbdebe6bfe7a22c8cb0c8b" exitCode=0 Feb 20 17:40:38 crc kubenswrapper[4697]: I0220 17:40:38.947883 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcfjz" event={"ID":"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16","Type":"ContainerDied","Data":"e7faa31a3f25b86aec7f9e4a4976af0243c9cc578cfbdebe6bfe7a22c8cb0c8b"} Feb 20 17:40:38 crc kubenswrapper[4697]: I0220 17:40:38.947916 4697 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-jcfjz" event={"ID":"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16","Type":"ContainerStarted","Data":"4bc5f45ce39583c95a8af1726be257b38ff0bd0adf6d5015ea01571c66d8a723"} Feb 20 17:40:38 crc kubenswrapper[4697]: I0220 17:40:38.955442 4697 generic.go:334] "Generic (PLEG): container finished" podID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerID="881a8eb12bffb9a73888ff9f0568b3051aa0f7695c0b1af0f79ae303704b0321" exitCode=0 Feb 20 17:40:38 crc kubenswrapper[4697]: I0220 17:40:38.956773 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np6pn" event={"ID":"4210e50a-a717-4c84-a1b7-6b50628cdc50","Type":"ContainerDied","Data":"881a8eb12bffb9a73888ff9f0568b3051aa0f7695c0b1af0f79ae303704b0321"} Feb 20 17:40:38 crc kubenswrapper[4697]: I0220 17:40:38.956822 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np6pn" event={"ID":"4210e50a-a717-4c84-a1b7-6b50628cdc50","Type":"ContainerStarted","Data":"925a8cebfea64aba14920405d861b5efa218bb31b58d7f1bfa6eab7d4e368c2d"} Feb 20 17:40:40 crc kubenswrapper[4697]: I0220 17:40:40.982663 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcfjz" event={"ID":"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16","Type":"ContainerStarted","Data":"72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82"} Feb 20 17:40:40 crc kubenswrapper[4697]: I0220 17:40:40.986221 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np6pn" event={"ID":"4210e50a-a717-4c84-a1b7-6b50628cdc50","Type":"ContainerStarted","Data":"b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009"} Feb 20 17:40:42 crc kubenswrapper[4697]: I0220 17:40:42.002617 4697 generic.go:334] "Generic (PLEG): container finished" podID="50352784-9415-4277-9122-aca9457a8363" 
containerID="0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4" exitCode=0 Feb 20 17:40:42 crc kubenswrapper[4697]: I0220 17:40:42.002690 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5cdg" event={"ID":"50352784-9415-4277-9122-aca9457a8363","Type":"ContainerDied","Data":"0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4"} Feb 20 17:40:44 crc kubenswrapper[4697]: I0220 17:40:44.038228 4697 generic.go:334] "Generic (PLEG): container finished" podID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerID="72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82" exitCode=0 Feb 20 17:40:44 crc kubenswrapper[4697]: I0220 17:40:44.038357 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcfjz" event={"ID":"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16","Type":"ContainerDied","Data":"72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82"} Feb 20 17:40:44 crc kubenswrapper[4697]: I0220 17:40:44.052292 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5cdg" event={"ID":"50352784-9415-4277-9122-aca9457a8363","Type":"ContainerStarted","Data":"01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7"} Feb 20 17:40:44 crc kubenswrapper[4697]: I0220 17:40:44.056882 4697 generic.go:334] "Generic (PLEG): container finished" podID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerID="b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009" exitCode=0 Feb 20 17:40:44 crc kubenswrapper[4697]: I0220 17:40:44.056961 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np6pn" event={"ID":"4210e50a-a717-4c84-a1b7-6b50628cdc50","Type":"ContainerDied","Data":"b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009"} Feb 20 17:40:44 crc kubenswrapper[4697]: I0220 17:40:44.145773 4697 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-operators-l5cdg" podStartSLOduration=3.3617149250000002 podStartE2EDuration="10.145728237s" podCreationTimestamp="2026-02-20 17:40:34 +0000 UTC" firstStartedPulling="2026-02-20 17:40:35.876064769 +0000 UTC m=+4143.656110187" lastFinishedPulling="2026-02-20 17:40:42.660078081 +0000 UTC m=+4150.440123499" observedRunningTime="2026-02-20 17:40:44.137155868 +0000 UTC m=+4151.917201276" watchObservedRunningTime="2026-02-20 17:40:44.145728237 +0000 UTC m=+4151.925773675" Feb 20 17:40:44 crc kubenswrapper[4697]: I0220 17:40:44.532230 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:44 crc kubenswrapper[4697]: I0220 17:40:44.532497 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:40:45 crc kubenswrapper[4697]: I0220 17:40:45.071195 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np6pn" event={"ID":"4210e50a-a717-4c84-a1b7-6b50628cdc50","Type":"ContainerStarted","Data":"07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76"} Feb 20 17:40:45 crc kubenswrapper[4697]: I0220 17:40:45.074723 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcfjz" event={"ID":"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16","Type":"ContainerStarted","Data":"efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0"} Feb 20 17:40:45 crc kubenswrapper[4697]: I0220 17:40:45.106224 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-np6pn" podStartSLOduration=3.61144738 podStartE2EDuration="9.106178327s" podCreationTimestamp="2026-02-20 17:40:36 +0000 UTC" firstStartedPulling="2026-02-20 17:40:38.959140821 +0000 UTC m=+4146.739186229" lastFinishedPulling="2026-02-20 17:40:44.453871758 +0000 UTC 
m=+4152.233917176" observedRunningTime="2026-02-20 17:40:45.09029831 +0000 UTC m=+4152.870343728" watchObservedRunningTime="2026-02-20 17:40:45.106178327 +0000 UTC m=+4152.886223745" Feb 20 17:40:45 crc kubenswrapper[4697]: I0220 17:40:45.122318 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jcfjz" podStartSLOduration=3.639040674 podStartE2EDuration="9.122298611s" podCreationTimestamp="2026-02-20 17:40:36 +0000 UTC" firstStartedPulling="2026-02-20 17:40:38.95091391 +0000 UTC m=+4146.730959318" lastFinishedPulling="2026-02-20 17:40:44.434171837 +0000 UTC m=+4152.214217255" observedRunningTime="2026-02-20 17:40:45.119294248 +0000 UTC m=+4152.899339656" watchObservedRunningTime="2026-02-20 17:40:45.122298611 +0000 UTC m=+4152.902344019" Feb 20 17:40:45 crc kubenswrapper[4697]: I0220 17:40:45.597548 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l5cdg" podUID="50352784-9415-4277-9122-aca9457a8363" containerName="registry-server" probeResult="failure" output=< Feb 20 17:40:45 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 17:40:45 crc kubenswrapper[4697]: > Feb 20 17:40:46 crc kubenswrapper[4697]: I0220 17:40:46.975342 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:46 crc kubenswrapper[4697]: I0220 17:40:46.976882 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:47 crc kubenswrapper[4697]: I0220 17:40:47.095773 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:47 crc kubenswrapper[4697]: I0220 17:40:47.095826 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jcfjz" Feb 20 
17:40:48 crc kubenswrapper[4697]: I0220 17:40:48.032085 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-np6pn" podUID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerName="registry-server" probeResult="failure" output=< Feb 20 17:40:48 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 17:40:48 crc kubenswrapper[4697]: > Feb 20 17:40:48 crc kubenswrapper[4697]: I0220 17:40:48.156779 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jcfjz" podUID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerName="registry-server" probeResult="failure" output=< Feb 20 17:40:48 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 17:40:48 crc kubenswrapper[4697]: > Feb 20 17:40:55 crc kubenswrapper[4697]: I0220 17:40:55.582075 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l5cdg" podUID="50352784-9415-4277-9122-aca9457a8363" containerName="registry-server" probeResult="failure" output=< Feb 20 17:40:55 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 17:40:55 crc kubenswrapper[4697]: > Feb 20 17:40:57 crc kubenswrapper[4697]: I0220 17:40:57.043744 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:57 crc kubenswrapper[4697]: I0220 17:40:57.106390 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:57 crc kubenswrapper[4697]: I0220 17:40:57.153117 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:57 crc kubenswrapper[4697]: I0220 17:40:57.200297 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:40:57 crc kubenswrapper[4697]: I0220 17:40:57.284494 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-np6pn"] Feb 20 17:40:58 crc kubenswrapper[4697]: I0220 17:40:58.204705 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-np6pn" podUID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerName="registry-server" containerID="cri-o://07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76" gracePeriod=2 Feb 20 17:40:58 crc kubenswrapper[4697]: I0220 17:40:58.685308 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:58 crc kubenswrapper[4697]: I0220 17:40:58.702117 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-utilities\") pod \"4210e50a-a717-4c84-a1b7-6b50628cdc50\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " Feb 20 17:40:58 crc kubenswrapper[4697]: I0220 17:40:58.702299 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twmw4\" (UniqueName: \"kubernetes.io/projected/4210e50a-a717-4c84-a1b7-6b50628cdc50-kube-api-access-twmw4\") pod \"4210e50a-a717-4c84-a1b7-6b50628cdc50\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " Feb 20 17:40:58 crc kubenswrapper[4697]: I0220 17:40:58.702516 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-catalog-content\") pod \"4210e50a-a717-4c84-a1b7-6b50628cdc50\" (UID: \"4210e50a-a717-4c84-a1b7-6b50628cdc50\") " Feb 20 17:40:58 crc kubenswrapper[4697]: I0220 17:40:58.703140 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-utilities" (OuterVolumeSpecName: "utilities") pod "4210e50a-a717-4c84-a1b7-6b50628cdc50" (UID: "4210e50a-a717-4c84-a1b7-6b50628cdc50"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:40:58 crc kubenswrapper[4697]: I0220 17:40:58.706498 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:40:58 crc kubenswrapper[4697]: I0220 17:40:58.712671 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4210e50a-a717-4c84-a1b7-6b50628cdc50-kube-api-access-twmw4" (OuterVolumeSpecName: "kube-api-access-twmw4") pod "4210e50a-a717-4c84-a1b7-6b50628cdc50" (UID: "4210e50a-a717-4c84-a1b7-6b50628cdc50"). InnerVolumeSpecName "kube-api-access-twmw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:40:58 crc kubenswrapper[4697]: I0220 17:40:58.774459 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4210e50a-a717-4c84-a1b7-6b50628cdc50" (UID: "4210e50a-a717-4c84-a1b7-6b50628cdc50"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:40:58 crc kubenswrapper[4697]: I0220 17:40:58.808749 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twmw4\" (UniqueName: \"kubernetes.io/projected/4210e50a-a717-4c84-a1b7-6b50628cdc50-kube-api-access-twmw4\") on node \"crc\" DevicePath \"\"" Feb 20 17:40:58 crc kubenswrapper[4697]: I0220 17:40:58.808816 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4210e50a-a717-4c84-a1b7-6b50628cdc50-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.216367 4697 generic.go:334] "Generic (PLEG): container finished" podID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerID="07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76" exitCode=0 Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.216458 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-np6pn" Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.216484 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np6pn" event={"ID":"4210e50a-a717-4c84-a1b7-6b50628cdc50","Type":"ContainerDied","Data":"07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76"} Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.216553 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np6pn" event={"ID":"4210e50a-a717-4c84-a1b7-6b50628cdc50","Type":"ContainerDied","Data":"925a8cebfea64aba14920405d861b5efa218bb31b58d7f1bfa6eab7d4e368c2d"} Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.216596 4697 scope.go:117] "RemoveContainer" containerID="07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76" Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.250087 4697 scope.go:117] "RemoveContainer" 
containerID="b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009" Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.262371 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-np6pn"] Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.277174 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-np6pn"] Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.281788 4697 scope.go:117] "RemoveContainer" containerID="881a8eb12bffb9a73888ff9f0568b3051aa0f7695c0b1af0f79ae303704b0321" Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.340826 4697 scope.go:117] "RemoveContainer" containerID="07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76" Feb 20 17:40:59 crc kubenswrapper[4697]: E0220 17:40:59.341620 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76\": container with ID starting with 07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76 not found: ID does not exist" containerID="07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76" Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.341829 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76"} err="failed to get container status \"07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76\": rpc error: code = NotFound desc = could not find container \"07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76\": container with ID starting with 07cf58f4384c615ec6fe25e5a31ec606d471f8040a66e86eb5b04859ff96ec76 not found: ID does not exist" Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.341936 4697 scope.go:117] "RemoveContainer" 
containerID="b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009" Feb 20 17:40:59 crc kubenswrapper[4697]: E0220 17:40:59.342315 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009\": container with ID starting with b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009 not found: ID does not exist" containerID="b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009" Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.342351 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009"} err="failed to get container status \"b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009\": rpc error: code = NotFound desc = could not find container \"b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009\": container with ID starting with b00872eba5ec7fc54dc3f5cd016736547a0a279822b2daf48953ab7bf4084009 not found: ID does not exist" Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.342378 4697 scope.go:117] "RemoveContainer" containerID="881a8eb12bffb9a73888ff9f0568b3051aa0f7695c0b1af0f79ae303704b0321" Feb 20 17:40:59 crc kubenswrapper[4697]: E0220 17:40:59.342673 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"881a8eb12bffb9a73888ff9f0568b3051aa0f7695c0b1af0f79ae303704b0321\": container with ID starting with 881a8eb12bffb9a73888ff9f0568b3051aa0f7695c0b1af0f79ae303704b0321 not found: ID does not exist" containerID="881a8eb12bffb9a73888ff9f0568b3051aa0f7695c0b1af0f79ae303704b0321" Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.342774 4697 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"881a8eb12bffb9a73888ff9f0568b3051aa0f7695c0b1af0f79ae303704b0321"} err="failed to get container status \"881a8eb12bffb9a73888ff9f0568b3051aa0f7695c0b1af0f79ae303704b0321\": rpc error: code = NotFound desc = could not find container \"881a8eb12bffb9a73888ff9f0568b3051aa0f7695c0b1af0f79ae303704b0321\": container with ID starting with 881a8eb12bffb9a73888ff9f0568b3051aa0f7695c0b1af0f79ae303704b0321 not found: ID does not exist" Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.486923 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jcfjz"] Feb 20 17:40:59 crc kubenswrapper[4697]: I0220 17:40:59.487347 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jcfjz" podUID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerName="registry-server" containerID="cri-o://efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0" gracePeriod=2 Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.005550 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.045031 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-catalog-content\") pod \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.045213 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg2t5\" (UniqueName: \"kubernetes.io/projected/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-kube-api-access-gg2t5\") pod \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.045236 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-utilities\") pod \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\" (UID: \"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16\") " Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.046377 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-utilities" (OuterVolumeSpecName: "utilities") pod "5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" (UID: "5b45ac9f-02b6-4acf-b2a2-45ccf5324b16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.052173 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-kube-api-access-gg2t5" (OuterVolumeSpecName: "kube-api-access-gg2t5") pod "5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" (UID: "5b45ac9f-02b6-4acf-b2a2-45ccf5324b16"). InnerVolumeSpecName "kube-api-access-gg2t5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.110146 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" (UID: "5b45ac9f-02b6-4acf-b2a2-45ccf5324b16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.147789 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg2t5\" (UniqueName: \"kubernetes.io/projected/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-kube-api-access-gg2t5\") on node \"crc\" DevicePath \"\"" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.148069 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.148128 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.228792 4697 generic.go:334] "Generic (PLEG): container finished" podID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerID="efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0" exitCode=0 Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.228845 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jcfjz" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.228864 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcfjz" event={"ID":"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16","Type":"ContainerDied","Data":"efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0"} Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.230085 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcfjz" event={"ID":"5b45ac9f-02b6-4acf-b2a2-45ccf5324b16","Type":"ContainerDied","Data":"4bc5f45ce39583c95a8af1726be257b38ff0bd0adf6d5015ea01571c66d8a723"} Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.230123 4697 scope.go:117] "RemoveContainer" containerID="efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.252510 4697 scope.go:117] "RemoveContainer" containerID="72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.270321 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jcfjz"] Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.280116 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jcfjz"] Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.291313 4697 scope.go:117] "RemoveContainer" containerID="e7faa31a3f25b86aec7f9e4a4976af0243c9cc578cfbdebe6bfe7a22c8cb0c8b" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.313178 4697 scope.go:117] "RemoveContainer" containerID="efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0" Feb 20 17:41:00 crc kubenswrapper[4697]: E0220 17:41:00.313644 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0\": container with ID starting with efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0 not found: ID does not exist" containerID="efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.313684 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0"} err="failed to get container status \"efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0\": rpc error: code = NotFound desc = could not find container \"efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0\": container with ID starting with efbc6bd9ec7798749a7c785f54223a9a1eab71f59d96665608281c5d7c50e0f0 not found: ID does not exist" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.313711 4697 scope.go:117] "RemoveContainer" containerID="72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82" Feb 20 17:41:00 crc kubenswrapper[4697]: E0220 17:41:00.314390 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82\": container with ID starting with 72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82 not found: ID does not exist" containerID="72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.314414 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82"} err="failed to get container status \"72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82\": rpc error: code = NotFound desc = could not find container \"72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82\": container with ID 
starting with 72860e66ead3d9aac935292d1028ee24608530df620e654b580b777fd1772b82 not found: ID does not exist" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.314428 4697 scope.go:117] "RemoveContainer" containerID="e7faa31a3f25b86aec7f9e4a4976af0243c9cc578cfbdebe6bfe7a22c8cb0c8b" Feb 20 17:41:00 crc kubenswrapper[4697]: E0220 17:41:00.314832 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7faa31a3f25b86aec7f9e4a4976af0243c9cc578cfbdebe6bfe7a22c8cb0c8b\": container with ID starting with e7faa31a3f25b86aec7f9e4a4976af0243c9cc578cfbdebe6bfe7a22c8cb0c8b not found: ID does not exist" containerID="e7faa31a3f25b86aec7f9e4a4976af0243c9cc578cfbdebe6bfe7a22c8cb0c8b" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.314863 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7faa31a3f25b86aec7f9e4a4976af0243c9cc578cfbdebe6bfe7a22c8cb0c8b"} err="failed to get container status \"e7faa31a3f25b86aec7f9e4a4976af0243c9cc578cfbdebe6bfe7a22c8cb0c8b\": rpc error: code = NotFound desc = could not find container \"e7faa31a3f25b86aec7f9e4a4976af0243c9cc578cfbdebe6bfe7a22c8cb0c8b\": container with ID starting with e7faa31a3f25b86aec7f9e4a4976af0243c9cc578cfbdebe6bfe7a22c8cb0c8b not found: ID does not exist" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.888777 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4210e50a-a717-4c84-a1b7-6b50628cdc50" path="/var/lib/kubelet/pods/4210e50a-a717-4c84-a1b7-6b50628cdc50/volumes" Feb 20 17:41:00 crc kubenswrapper[4697]: I0220 17:41:00.890386 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" path="/var/lib/kubelet/pods/5b45ac9f-02b6-4acf-b2a2-45ccf5324b16/volumes" Feb 20 17:41:04 crc kubenswrapper[4697]: I0220 17:41:04.577220 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:41:04 crc kubenswrapper[4697]: I0220 17:41:04.623051 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:41:05 crc kubenswrapper[4697]: I0220 17:41:05.693304 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5cdg"] Feb 20 17:41:06 crc kubenswrapper[4697]: I0220 17:41:06.297562 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l5cdg" podUID="50352784-9415-4277-9122-aca9457a8363" containerName="registry-server" containerID="cri-o://01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7" gracePeriod=2 Feb 20 17:41:06 crc kubenswrapper[4697]: I0220 17:41:06.831376 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:41:06 crc kubenswrapper[4697]: I0220 17:41:06.923510 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-utilities\") pod \"50352784-9415-4277-9122-aca9457a8363\" (UID: \"50352784-9415-4277-9122-aca9457a8363\") " Feb 20 17:41:06 crc kubenswrapper[4697]: I0220 17:41:06.923950 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7mnk\" (UniqueName: \"kubernetes.io/projected/50352784-9415-4277-9122-aca9457a8363-kube-api-access-b7mnk\") pod \"50352784-9415-4277-9122-aca9457a8363\" (UID: \"50352784-9415-4277-9122-aca9457a8363\") " Feb 20 17:41:06 crc kubenswrapper[4697]: I0220 17:41:06.923995 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-catalog-content\") pod \"50352784-9415-4277-9122-aca9457a8363\" (UID: 
\"50352784-9415-4277-9122-aca9457a8363\") " Feb 20 17:41:06 crc kubenswrapper[4697]: I0220 17:41:06.924703 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-utilities" (OuterVolumeSpecName: "utilities") pod "50352784-9415-4277-9122-aca9457a8363" (UID: "50352784-9415-4277-9122-aca9457a8363"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:41:06 crc kubenswrapper[4697]: I0220 17:41:06.925147 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:41:06 crc kubenswrapper[4697]: I0220 17:41:06.944276 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50352784-9415-4277-9122-aca9457a8363-kube-api-access-b7mnk" (OuterVolumeSpecName: "kube-api-access-b7mnk") pod "50352784-9415-4277-9122-aca9457a8363" (UID: "50352784-9415-4277-9122-aca9457a8363"). InnerVolumeSpecName "kube-api-access-b7mnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.027670 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7mnk\" (UniqueName: \"kubernetes.io/projected/50352784-9415-4277-9122-aca9457a8363-kube-api-access-b7mnk\") on node \"crc\" DevicePath \"\"" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.044811 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50352784-9415-4277-9122-aca9457a8363" (UID: "50352784-9415-4277-9122-aca9457a8363"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.129676 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50352784-9415-4277-9122-aca9457a8363-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.320694 4697 generic.go:334] "Generic (PLEG): container finished" podID="50352784-9415-4277-9122-aca9457a8363" containerID="01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7" exitCode=0 Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.320735 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5cdg" event={"ID":"50352784-9415-4277-9122-aca9457a8363","Type":"ContainerDied","Data":"01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7"} Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.320762 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5cdg" event={"ID":"50352784-9415-4277-9122-aca9457a8363","Type":"ContainerDied","Data":"f0bf20b9be1c9b13c53a523f2a16df5645ad764480b67d9fd7c862b46f72ea06"} Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.320780 4697 scope.go:117] "RemoveContainer" containerID="01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.320801 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l5cdg" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.344967 4697 scope.go:117] "RemoveContainer" containerID="0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.365898 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5cdg"] Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.406823 4697 scope.go:117] "RemoveContainer" containerID="7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.423644 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l5cdg"] Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.463470 4697 scope.go:117] "RemoveContainer" containerID="01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7" Feb 20 17:41:07 crc kubenswrapper[4697]: E0220 17:41:07.464870 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7\": container with ID starting with 01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7 not found: ID does not exist" containerID="01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.464909 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7"} err="failed to get container status \"01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7\": rpc error: code = NotFound desc = could not find container \"01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7\": container with ID starting with 01bb6fa16c2ff42cfc0fef0e1861c5f89bc5fd418163f4a43da8c124980d68e7 not found: ID does 
not exist" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.464936 4697 scope.go:117] "RemoveContainer" containerID="0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4" Feb 20 17:41:07 crc kubenswrapper[4697]: E0220 17:41:07.465459 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4\": container with ID starting with 0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4 not found: ID does not exist" containerID="0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.465493 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4"} err="failed to get container status \"0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4\": rpc error: code = NotFound desc = could not find container \"0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4\": container with ID starting with 0fd9d5e16958fac47148434e7849e601b207e6a9c22f2a942d644cc251387bb4 not found: ID does not exist" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.465512 4697 scope.go:117] "RemoveContainer" containerID="7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb" Feb 20 17:41:07 crc kubenswrapper[4697]: E0220 17:41:07.465824 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb\": container with ID starting with 7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb not found: ID does not exist" containerID="7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb" Feb 20 17:41:07 crc kubenswrapper[4697]: I0220 17:41:07.465854 4697 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb"} err="failed to get container status \"7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb\": rpc error: code = NotFound desc = could not find container \"7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb\": container with ID starting with 7149a3312c69bb93fc8aed418b693787b7c15781b645cc93fb6d0f13400458cb not found: ID does not exist" Feb 20 17:41:08 crc kubenswrapper[4697]: I0220 17:41:08.888338 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50352784-9415-4277-9122-aca9457a8363" path="/var/lib/kubelet/pods/50352784-9415-4277-9122-aca9457a8363/volumes" Feb 20 17:41:31 crc kubenswrapper[4697]: I0220 17:41:31.185079 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:41:31 crc kubenswrapper[4697]: I0220 17:41:31.185763 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:42:01 crc kubenswrapper[4697]: I0220 17:42:01.185569 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:42:01 crc kubenswrapper[4697]: I0220 17:42:01.186213 4697 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:42:31 crc kubenswrapper[4697]: I0220 17:42:31.184954 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:42:31 crc kubenswrapper[4697]: I0220 17:42:31.185565 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:42:31 crc kubenswrapper[4697]: I0220 17:42:31.185618 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 17:42:31 crc kubenswrapper[4697]: I0220 17:42:31.186444 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 17:42:31 crc kubenswrapper[4697]: I0220 17:42:31.186498 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" 
containerID="cri-o://589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" gracePeriod=600 Feb 20 17:42:31 crc kubenswrapper[4697]: E0220 17:42:31.515722 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:42:32 crc kubenswrapper[4697]: I0220 17:42:32.177457 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" exitCode=0 Feb 20 17:42:32 crc kubenswrapper[4697]: I0220 17:42:32.177507 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"} Feb 20 17:42:32 crc kubenswrapper[4697]: I0220 17:42:32.177544 4697 scope.go:117] "RemoveContainer" containerID="06d16abf023e436e56d3f9256ee36737fec7b615b838934ac0a5478653ced746" Feb 20 17:42:32 crc kubenswrapper[4697]: I0220 17:42:32.178345 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" Feb 20 17:42:32 crc kubenswrapper[4697]: E0220 17:42:32.178690 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" 
podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:42:44 crc kubenswrapper[4697]: I0220 17:42:44.877322 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" Feb 20 17:42:44 crc kubenswrapper[4697]: E0220 17:42:44.880087 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:42:59 crc kubenswrapper[4697]: I0220 17:42:59.877175 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" Feb 20 17:42:59 crc kubenswrapper[4697]: E0220 17:42:59.878119 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:43:11 crc kubenswrapper[4697]: I0220 17:43:11.855657 4697 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-8wmf5 container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.36:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 20 17:43:11 crc kubenswrapper[4697]: I0220 17:43:11.856183 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-8wmf5" podUID="12db7fd9-ce39-43cf-99b7-3a56791c0390" 
containerName="operator" probeResult="failure" output="Get \"http://10.217.0.36:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 17:43:11 crc kubenswrapper[4697]: I0220 17:43:11.969226 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" Feb 20 17:43:11 crc kubenswrapper[4697]: E0220 17:43:11.969739 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:43:24 crc kubenswrapper[4697]: I0220 17:43:24.877807 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" Feb 20 17:43:24 crc kubenswrapper[4697]: E0220 17:43:24.878851 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:43:37 crc kubenswrapper[4697]: I0220 17:43:37.877375 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" Feb 20 17:43:37 crc kubenswrapper[4697]: E0220 17:43:37.878242 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:43:50 crc kubenswrapper[4697]: I0220 17:43:50.877387 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" Feb 20 17:43:50 crc kubenswrapper[4697]: E0220 17:43:50.878268 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:44:03 crc kubenswrapper[4697]: I0220 17:44:03.877508 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" Feb 20 17:44:03 crc kubenswrapper[4697]: E0220 17:44:03.878128 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.453474 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d97rd"] Feb 20 17:44:17 crc kubenswrapper[4697]: E0220 17:44:17.454449 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerName="extract-utilities" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 
17:44:17.454470 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerName="extract-utilities" Feb 20 17:44:17 crc kubenswrapper[4697]: E0220 17:44:17.454487 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50352784-9415-4277-9122-aca9457a8363" containerName="extract-utilities" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.454493 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="50352784-9415-4277-9122-aca9457a8363" containerName="extract-utilities" Feb 20 17:44:17 crc kubenswrapper[4697]: E0220 17:44:17.454507 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50352784-9415-4277-9122-aca9457a8363" containerName="registry-server" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.454513 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="50352784-9415-4277-9122-aca9457a8363" containerName="registry-server" Feb 20 17:44:17 crc kubenswrapper[4697]: E0220 17:44:17.454529 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerName="registry-server" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.454535 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerName="registry-server" Feb 20 17:44:17 crc kubenswrapper[4697]: E0220 17:44:17.454551 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerName="extract-utilities" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.454557 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerName="extract-utilities" Feb 20 17:44:17 crc kubenswrapper[4697]: E0220 17:44:17.454568 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerName="extract-content" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 
17:44:17.454574 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerName="extract-content" Feb 20 17:44:17 crc kubenswrapper[4697]: E0220 17:44:17.454586 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50352784-9415-4277-9122-aca9457a8363" containerName="extract-content" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.454592 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="50352784-9415-4277-9122-aca9457a8363" containerName="extract-content" Feb 20 17:44:17 crc kubenswrapper[4697]: E0220 17:44:17.454610 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerName="registry-server" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.454617 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerName="registry-server" Feb 20 17:44:17 crc kubenswrapper[4697]: E0220 17:44:17.454627 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerName="extract-content" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.454632 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerName="extract-content" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.454823 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="4210e50a-a717-4c84-a1b7-6b50628cdc50" containerName="registry-server" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.454850 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b45ac9f-02b6-4acf-b2a2-45ccf5324b16" containerName="registry-server" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.454860 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="50352784-9415-4277-9122-aca9457a8363" containerName="registry-server" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 
17:44:17.456477 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d97rd" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.462930 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d97rd"] Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.619957 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-catalog-content\") pod \"redhat-marketplace-d97rd\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") " pod="openshift-marketplace/redhat-marketplace-d97rd" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.621546 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn5kd\" (UniqueName: \"kubernetes.io/projected/b37d62df-2778-4859-8b73-284085936bb5-kube-api-access-cn5kd\") pod \"redhat-marketplace-d97rd\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") " pod="openshift-marketplace/redhat-marketplace-d97rd" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.621629 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-utilities\") pod \"redhat-marketplace-d97rd\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") " pod="openshift-marketplace/redhat-marketplace-d97rd" Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.723663 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn5kd\" (UniqueName: \"kubernetes.io/projected/b37d62df-2778-4859-8b73-284085936bb5-kube-api-access-cn5kd\") pod \"redhat-marketplace-d97rd\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") " pod="openshift-marketplace/redhat-marketplace-d97rd" Feb 20 17:44:17 crc 
kubenswrapper[4697]: I0220 17:44:17.723741 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-utilities\") pod \"redhat-marketplace-d97rd\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") " pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.723816 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-catalog-content\") pod \"redhat-marketplace-d97rd\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") " pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.724680 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-catalog-content\") pod \"redhat-marketplace-d97rd\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") " pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.725539 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-utilities\") pod \"redhat-marketplace-d97rd\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") " pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.745810 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn5kd\" (UniqueName: \"kubernetes.io/projected/b37d62df-2778-4859-8b73-284085936bb5-kube-api-access-cn5kd\") pod \"redhat-marketplace-d97rd\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") " pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:17 crc kubenswrapper[4697]: I0220 17:44:17.782182 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:18 crc kubenswrapper[4697]: I0220 17:44:18.256656 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d97rd"]
Feb 20 17:44:18 crc kubenswrapper[4697]: I0220 17:44:18.701247 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d97rd" event={"ID":"b37d62df-2778-4859-8b73-284085936bb5","Type":"ContainerStarted","Data":"1c0ee11591491b59bf6da21ec38f47dc5a724b542ab6b140eff9155fb1216df4"}
Feb 20 17:44:18 crc kubenswrapper[4697]: I0220 17:44:18.878610 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:44:18 crc kubenswrapper[4697]: E0220 17:44:18.878961 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:44:19 crc kubenswrapper[4697]: I0220 17:44:19.712358 4697 generic.go:334] "Generic (PLEG): container finished" podID="b37d62df-2778-4859-8b73-284085936bb5" containerID="a62e0ebb33e511ba52a44e4154c99b699ce7f8cbabb13be74f83b2d6583ec2d8" exitCode=0
Feb 20 17:44:19 crc kubenswrapper[4697]: I0220 17:44:19.712493 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d97rd" event={"ID":"b37d62df-2778-4859-8b73-284085936bb5","Type":"ContainerDied","Data":"a62e0ebb33e511ba52a44e4154c99b699ce7f8cbabb13be74f83b2d6583ec2d8"}
Feb 20 17:44:20 crc kubenswrapper[4697]: I0220 17:44:20.725799 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d97rd" event={"ID":"b37d62df-2778-4859-8b73-284085936bb5","Type":"ContainerStarted","Data":"51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545"}
Feb 20 17:44:21 crc kubenswrapper[4697]: I0220 17:44:21.738306 4697 generic.go:334] "Generic (PLEG): container finished" podID="b37d62df-2778-4859-8b73-284085936bb5" containerID="51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545" exitCode=0
Feb 20 17:44:21 crc kubenswrapper[4697]: I0220 17:44:21.738367 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d97rd" event={"ID":"b37d62df-2778-4859-8b73-284085936bb5","Type":"ContainerDied","Data":"51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545"}
Feb 20 17:44:22 crc kubenswrapper[4697]: I0220 17:44:22.750145 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d97rd" event={"ID":"b37d62df-2778-4859-8b73-284085936bb5","Type":"ContainerStarted","Data":"311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd"}
Feb 20 17:44:22 crc kubenswrapper[4697]: I0220 17:44:22.772814 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d97rd" podStartSLOduration=3.255354875 podStartE2EDuration="5.772795478s" podCreationTimestamp="2026-02-20 17:44:17 +0000 UTC" firstStartedPulling="2026-02-20 17:44:19.715689072 +0000 UTC m=+4367.495734480" lastFinishedPulling="2026-02-20 17:44:22.233129665 +0000 UTC m=+4370.013175083" observedRunningTime="2026-02-20 17:44:22.772104821 +0000 UTC m=+4370.552150269" watchObservedRunningTime="2026-02-20 17:44:22.772795478 +0000 UTC m=+4370.552840896"
Feb 20 17:44:27 crc kubenswrapper[4697]: I0220 17:44:27.783391 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:27 crc kubenswrapper[4697]: I0220 17:44:27.783860 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:27 crc kubenswrapper[4697]: I0220 17:44:27.835204 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:27 crc kubenswrapper[4697]: I0220 17:44:27.909697 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:28 crc kubenswrapper[4697]: I0220 17:44:28.076978 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d97rd"]
Feb 20 17:44:29 crc kubenswrapper[4697]: I0220 17:44:29.820745 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d97rd" podUID="b37d62df-2778-4859-8b73-284085936bb5" containerName="registry-server" containerID="cri-o://311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd" gracePeriod=2
Feb 20 17:44:29 crc kubenswrapper[4697]: I0220 17:44:29.878019 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:44:29 crc kubenswrapper[4697]: E0220 17:44:29.878396 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.328269 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.482775 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-utilities\") pod \"b37d62df-2778-4859-8b73-284085936bb5\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") "
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.482987 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-catalog-content\") pod \"b37d62df-2778-4859-8b73-284085936bb5\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") "
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.483145 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn5kd\" (UniqueName: \"kubernetes.io/projected/b37d62df-2778-4859-8b73-284085936bb5-kube-api-access-cn5kd\") pod \"b37d62df-2778-4859-8b73-284085936bb5\" (UID: \"b37d62df-2778-4859-8b73-284085936bb5\") "
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.483737 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-utilities" (OuterVolumeSpecName: "utilities") pod "b37d62df-2778-4859-8b73-284085936bb5" (UID: "b37d62df-2778-4859-8b73-284085936bb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.488340 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b37d62df-2778-4859-8b73-284085936bb5-kube-api-access-cn5kd" (OuterVolumeSpecName: "kube-api-access-cn5kd") pod "b37d62df-2778-4859-8b73-284085936bb5" (UID: "b37d62df-2778-4859-8b73-284085936bb5"). InnerVolumeSpecName "kube-api-access-cn5kd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.517085 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b37d62df-2778-4859-8b73-284085936bb5" (UID: "b37d62df-2778-4859-8b73-284085936bb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.586125 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.586173 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn5kd\" (UniqueName: \"kubernetes.io/projected/b37d62df-2778-4859-8b73-284085936bb5-kube-api-access-cn5kd\") on node \"crc\" DevicePath \"\""
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.586187 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37d62df-2778-4859-8b73-284085936bb5-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.841534 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d97rd" event={"ID":"b37d62df-2778-4859-8b73-284085936bb5","Type":"ContainerDied","Data":"311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd"}
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.841595 4697 generic.go:334] "Generic (PLEG): container finished" podID="b37d62df-2778-4859-8b73-284085936bb5" containerID="311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd" exitCode=0
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.841781 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d97rd" event={"ID":"b37d62df-2778-4859-8b73-284085936bb5","Type":"ContainerDied","Data":"1c0ee11591491b59bf6da21ec38f47dc5a724b542ab6b140eff9155fb1216df4"}
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.841685 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d97rd"
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.841662 4697 scope.go:117] "RemoveContainer" containerID="311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd"
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.893756 4697 scope.go:117] "RemoveContainer" containerID="51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545"
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.896184 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d97rd"]
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.896228 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d97rd"]
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.931776 4697 scope.go:117] "RemoveContainer" containerID="a62e0ebb33e511ba52a44e4154c99b699ce7f8cbabb13be74f83b2d6583ec2d8"
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.983807 4697 scope.go:117] "RemoveContainer" containerID="311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd"
Feb 20 17:44:30 crc kubenswrapper[4697]: E0220 17:44:30.984574 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd\": container with ID starting with 311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd not found: ID does not exist" containerID="311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd"
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.984684 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd"} err="failed to get container status \"311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd\": rpc error: code = NotFound desc = could not find container \"311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd\": container with ID starting with 311d8f5b3a58f8e34df8af9fb63a15a0142a337331f6ceb7995dbda7c1d569dd not found: ID does not exist"
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.984792 4697 scope.go:117] "RemoveContainer" containerID="51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545"
Feb 20 17:44:30 crc kubenswrapper[4697]: E0220 17:44:30.985575 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545\": container with ID starting with 51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545 not found: ID does not exist" containerID="51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545"
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.985656 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545"} err="failed to get container status \"51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545\": rpc error: code = NotFound desc = could not find container \"51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545\": container with ID starting with 51760f4f5750c47afef7248df3547c28f84e5d2e501fa87fe744cde44a922545 not found: ID does not exist"
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.985716 4697 scope.go:117] "RemoveContainer" containerID="a62e0ebb33e511ba52a44e4154c99b699ce7f8cbabb13be74f83b2d6583ec2d8"
Feb 20 17:44:30 crc kubenswrapper[4697]: E0220 17:44:30.986359 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a62e0ebb33e511ba52a44e4154c99b699ce7f8cbabb13be74f83b2d6583ec2d8\": container with ID starting with a62e0ebb33e511ba52a44e4154c99b699ce7f8cbabb13be74f83b2d6583ec2d8 not found: ID does not exist" containerID="a62e0ebb33e511ba52a44e4154c99b699ce7f8cbabb13be74f83b2d6583ec2d8"
Feb 20 17:44:30 crc kubenswrapper[4697]: I0220 17:44:30.986411 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a62e0ebb33e511ba52a44e4154c99b699ce7f8cbabb13be74f83b2d6583ec2d8"} err="failed to get container status \"a62e0ebb33e511ba52a44e4154c99b699ce7f8cbabb13be74f83b2d6583ec2d8\": rpc error: code = NotFound desc = could not find container \"a62e0ebb33e511ba52a44e4154c99b699ce7f8cbabb13be74f83b2d6583ec2d8\": container with ID starting with a62e0ebb33e511ba52a44e4154c99b699ce7f8cbabb13be74f83b2d6583ec2d8 not found: ID does not exist"
Feb 20 17:44:32 crc kubenswrapper[4697]: I0220 17:44:32.890591 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b37d62df-2778-4859-8b73-284085936bb5" path="/var/lib/kubelet/pods/b37d62df-2778-4859-8b73-284085936bb5/volumes"
Feb 20 17:44:43 crc kubenswrapper[4697]: I0220 17:44:43.876852 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:44:43 crc kubenswrapper[4697]: E0220 17:44:43.877642 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:44:55 crc kubenswrapper[4697]: I0220 17:44:55.877185 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:44:55 crc kubenswrapper[4697]: E0220 17:44:55.878085 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.204770 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"]
Feb 20 17:45:00 crc kubenswrapper[4697]: E0220 17:45:00.205937 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37d62df-2778-4859-8b73-284085936bb5" containerName="extract-content"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.205957 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37d62df-2778-4859-8b73-284085936bb5" containerName="extract-content"
Feb 20 17:45:00 crc kubenswrapper[4697]: E0220 17:45:00.205979 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37d62df-2778-4859-8b73-284085936bb5" containerName="registry-server"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.205989 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37d62df-2778-4859-8b73-284085936bb5" containerName="registry-server"
Feb 20 17:45:00 crc kubenswrapper[4697]: E0220 17:45:00.206018 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37d62df-2778-4859-8b73-284085936bb5" containerName="extract-utilities"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.206030 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37d62df-2778-4859-8b73-284085936bb5" containerName="extract-utilities"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.206331 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37d62df-2778-4859-8b73-284085936bb5" containerName="registry-server"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.207365 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.212690 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.212907 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.218278 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"]
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.290280 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw2nd\" (UniqueName: \"kubernetes.io/projected/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-kube-api-access-kw2nd\") pod \"collect-profiles-29526825-wv7t4\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.290516 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-config-volume\") pod \"collect-profiles-29526825-wv7t4\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.290553 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-secret-volume\") pod \"collect-profiles-29526825-wv7t4\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.391390 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-config-volume\") pod \"collect-profiles-29526825-wv7t4\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.391489 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-secret-volume\") pod \"collect-profiles-29526825-wv7t4\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.391583 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw2nd\" (UniqueName: \"kubernetes.io/projected/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-kube-api-access-kw2nd\") pod \"collect-profiles-29526825-wv7t4\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.392630 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-config-volume\") pod \"collect-profiles-29526825-wv7t4\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.398045 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-secret-volume\") pod \"collect-profiles-29526825-wv7t4\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.406679 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw2nd\" (UniqueName: \"kubernetes.io/projected/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-kube-api-access-kw2nd\") pod \"collect-profiles-29526825-wv7t4\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:00 crc kubenswrapper[4697]: I0220 17:45:00.530826 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:01 crc kubenswrapper[4697]: I0220 17:45:01.852794 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"]
Feb 20 17:45:02 crc kubenswrapper[4697]: I0220 17:45:02.168983 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4" event={"ID":"9def3e65-38cf-4ae4-b9ed-0f3446b589eb","Type":"ContainerStarted","Data":"3111c2c5ec97d4e6369baec6220954ef11987499af1f969e2b09c727dd0b4b2a"}
Feb 20 17:45:02 crc kubenswrapper[4697]: I0220 17:45:02.169344 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4" event={"ID":"9def3e65-38cf-4ae4-b9ed-0f3446b589eb","Type":"ContainerStarted","Data":"ca5879267f72185d582eb58e08e5352502e8a34a8110f009ba4bd722a8210743"}
Feb 20 17:45:02 crc kubenswrapper[4697]: I0220 17:45:02.186218 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4" podStartSLOduration=2.186202815 podStartE2EDuration="2.186202815s" podCreationTimestamp="2026-02-20 17:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 17:45:02.183866909 +0000 UTC m=+4409.963912317" watchObservedRunningTime="2026-02-20 17:45:02.186202815 +0000 UTC m=+4409.966248223"
Feb 20 17:45:03 crc kubenswrapper[4697]: I0220 17:45:03.180623 4697 generic.go:334] "Generic (PLEG): container finished" podID="9def3e65-38cf-4ae4-b9ed-0f3446b589eb" containerID="3111c2c5ec97d4e6369baec6220954ef11987499af1f969e2b09c727dd0b4b2a" exitCode=0
Feb 20 17:45:03 crc kubenswrapper[4697]: I0220 17:45:03.180686 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4" event={"ID":"9def3e65-38cf-4ae4-b9ed-0f3446b589eb","Type":"ContainerDied","Data":"3111c2c5ec97d4e6369baec6220954ef11987499af1f969e2b09c727dd0b4b2a"}
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.558869 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.745641 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw2nd\" (UniqueName: \"kubernetes.io/projected/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-kube-api-access-kw2nd\") pod \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") "
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.745808 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-secret-volume\") pod \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") "
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.745834 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-config-volume\") pod \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\" (UID: \"9def3e65-38cf-4ae4-b9ed-0f3446b589eb\") "
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.746599 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-config-volume" (OuterVolumeSpecName: "config-volume") pod "9def3e65-38cf-4ae4-b9ed-0f3446b589eb" (UID: "9def3e65-38cf-4ae4-b9ed-0f3446b589eb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.751989 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9def3e65-38cf-4ae4-b9ed-0f3446b589eb" (UID: "9def3e65-38cf-4ae4-b9ed-0f3446b589eb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.753558 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-kube-api-access-kw2nd" (OuterVolumeSpecName: "kube-api-access-kw2nd") pod "9def3e65-38cf-4ae4-b9ed-0f3446b589eb" (UID: "9def3e65-38cf-4ae4-b9ed-0f3446b589eb"). InnerVolumeSpecName "kube-api-access-kw2nd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.848275 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw2nd\" (UniqueName: \"kubernetes.io/projected/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-kube-api-access-kw2nd\") on node \"crc\" DevicePath \"\""
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.848315 4697 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.848325 4697 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9def3e65-38cf-4ae4-b9ed-0f3446b589eb-config-volume\") on node \"crc\" DevicePath \"\""
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.928319 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm"]
Feb 20 17:45:04 crc kubenswrapper[4697]: I0220 17:45:04.940688 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526780-zbftm"]
Feb 20 17:45:05 crc kubenswrapper[4697]: I0220 17:45:05.198705 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4" event={"ID":"9def3e65-38cf-4ae4-b9ed-0f3446b589eb","Type":"ContainerDied","Data":"ca5879267f72185d582eb58e08e5352502e8a34a8110f009ba4bd722a8210743"}
Feb 20 17:45:05 crc kubenswrapper[4697]: I0220 17:45:05.198742 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca5879267f72185d582eb58e08e5352502e8a34a8110f009ba4bd722a8210743"
Feb 20 17:45:05 crc kubenswrapper[4697]: I0220 17:45:05.198794 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526825-wv7t4"
Feb 20 17:45:06 crc kubenswrapper[4697]: I0220 17:45:06.895879 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c561e953-8a30-4fc8-b054-d4d79cec06be" path="/var/lib/kubelet/pods/c561e953-8a30-4fc8-b054-d4d79cec06be/volumes"
Feb 20 17:45:10 crc kubenswrapper[4697]: I0220 17:45:10.879055 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:45:10 crc kubenswrapper[4697]: E0220 17:45:10.879764 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:45:23 crc kubenswrapper[4697]: I0220 17:45:23.879277 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:45:23 crc kubenswrapper[4697]: E0220 17:45:23.880946 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:45:38 crc kubenswrapper[4697]: I0220 17:45:38.877153 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:45:38 crc kubenswrapper[4697]: E0220 17:45:38.878231 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:45:47 crc kubenswrapper[4697]: I0220 17:45:47.323685 4697 scope.go:117] "RemoveContainer" containerID="9d845d7cd5adac4bc5945aea3fb2bbab479433d0c129e31f7c5f8f92db2d382b"
Feb 20 17:45:51 crc kubenswrapper[4697]: I0220 17:45:51.877591 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:45:51 crc kubenswrapper[4697]: E0220 17:45:51.878924 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:46:05 crc kubenswrapper[4697]: I0220 17:46:05.877479 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:46:05 crc kubenswrapper[4697]: E0220 17:46:05.878177 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:46:17 crc kubenswrapper[4697]: I0220 17:46:17.878514 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:46:17 crc kubenswrapper[4697]: E0220 17:46:17.879385 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:46:31 crc kubenswrapper[4697]: I0220 17:46:31.877765 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:46:31 crc kubenswrapper[4697]: E0220 17:46:31.879537 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:46:44 crc kubenswrapper[4697]: I0220 17:46:44.877191 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:46:44 crc kubenswrapper[4697]: E0220 17:46:44.878428 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:46:57 crc kubenswrapper[4697]: I0220 17:46:57.877129 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:46:57 crc kubenswrapper[4697]: E0220 17:46:57.878029 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"
Feb 20 17:47:12 crc kubenswrapper[4697]: I0220 17:47:12.883942 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b"
Feb 20 17:47:12 crc kubenswrapper[4697]: E0220 17:47:12.885049 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed
container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:47:23 crc kubenswrapper[4697]: I0220 17:47:23.877147 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" Feb 20 17:47:23 crc kubenswrapper[4697]: E0220 17:47:23.878065 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:47:35 crc kubenswrapper[4697]: I0220 17:47:35.878140 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" Feb 20 17:47:37 crc kubenswrapper[4697]: I0220 17:47:37.029422 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"41bad0139a8715cab8dbf07356aee82331b7993f510c91a6156eaac1ab9f7fa2"} Feb 20 17:50:01 crc kubenswrapper[4697]: I0220 17:50:01.185226 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:50:01 crc kubenswrapper[4697]: I0220 17:50:01.186050 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" 
podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:50:31 crc kubenswrapper[4697]: I0220 17:50:31.185220 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:50:31 crc kubenswrapper[4697]: I0220 17:50:31.185909 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.365518 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vbb5r"] Feb 20 17:50:36 crc kubenswrapper[4697]: E0220 17:50:36.366860 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9def3e65-38cf-4ae4-b9ed-0f3446b589eb" containerName="collect-profiles" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.366911 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="9def3e65-38cf-4ae4-b9ed-0f3446b589eb" containerName="collect-profiles" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.367290 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="9def3e65-38cf-4ae4-b9ed-0f3446b589eb" containerName="collect-profiles" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.369165 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.403078 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vbb5r"] Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.485527 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-catalog-content\") pod \"redhat-operators-vbb5r\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.485581 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-utilities\") pod \"redhat-operators-vbb5r\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.485990 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6llb4\" (UniqueName: \"kubernetes.io/projected/6ead966f-3c06-4bd7-9be8-ba47f40fe568-kube-api-access-6llb4\") pod \"redhat-operators-vbb5r\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.588471 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6llb4\" (UniqueName: \"kubernetes.io/projected/6ead966f-3c06-4bd7-9be8-ba47f40fe568-kube-api-access-6llb4\") pod \"redhat-operators-vbb5r\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.588590 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-catalog-content\") pod \"redhat-operators-vbb5r\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.588610 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-utilities\") pod \"redhat-operators-vbb5r\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.589063 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-utilities\") pod \"redhat-operators-vbb5r\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.589242 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-catalog-content\") pod \"redhat-operators-vbb5r\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.626860 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6llb4\" (UniqueName: \"kubernetes.io/projected/6ead966f-3c06-4bd7-9be8-ba47f40fe568-kube-api-access-6llb4\") pod \"redhat-operators-vbb5r\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:36 crc kubenswrapper[4697]: I0220 17:50:36.697579 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:37 crc kubenswrapper[4697]: I0220 17:50:37.194896 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vbb5r"] Feb 20 17:50:37 crc kubenswrapper[4697]: I0220 17:50:37.444382 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbb5r" event={"ID":"6ead966f-3c06-4bd7-9be8-ba47f40fe568","Type":"ContainerStarted","Data":"46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301"} Feb 20 17:50:37 crc kubenswrapper[4697]: I0220 17:50:37.444723 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbb5r" event={"ID":"6ead966f-3c06-4bd7-9be8-ba47f40fe568","Type":"ContainerStarted","Data":"b28f2c463e25528b752795530bfa94b01cb57792ed5f4c43f858a8b481092779"} Feb 20 17:50:37 crc kubenswrapper[4697]: I0220 17:50:37.452507 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 17:50:38 crc kubenswrapper[4697]: I0220 17:50:38.466663 4697 generic.go:334] "Generic (PLEG): container finished" podID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerID="46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301" exitCode=0 Feb 20 17:50:38 crc kubenswrapper[4697]: I0220 17:50:38.467101 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbb5r" event={"ID":"6ead966f-3c06-4bd7-9be8-ba47f40fe568","Type":"ContainerDied","Data":"46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301"} Feb 20 17:50:40 crc kubenswrapper[4697]: I0220 17:50:40.489333 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbb5r" event={"ID":"6ead966f-3c06-4bd7-9be8-ba47f40fe568","Type":"ContainerStarted","Data":"9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e"} Feb 20 17:50:44 crc kubenswrapper[4697]: I0220 
17:50:44.530174 4697 generic.go:334] "Generic (PLEG): container finished" podID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerID="9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e" exitCode=0 Feb 20 17:50:44 crc kubenswrapper[4697]: I0220 17:50:44.530285 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbb5r" event={"ID":"6ead966f-3c06-4bd7-9be8-ba47f40fe568","Type":"ContainerDied","Data":"9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e"} Feb 20 17:50:45 crc kubenswrapper[4697]: I0220 17:50:45.545010 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbb5r" event={"ID":"6ead966f-3c06-4bd7-9be8-ba47f40fe568","Type":"ContainerStarted","Data":"f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740"} Feb 20 17:50:45 crc kubenswrapper[4697]: I0220 17:50:45.577067 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vbb5r" podStartSLOduration=2.107639731 podStartE2EDuration="9.577048059s" podCreationTimestamp="2026-02-20 17:50:36 +0000 UTC" firstStartedPulling="2026-02-20 17:50:37.452203606 +0000 UTC m=+4745.232249014" lastFinishedPulling="2026-02-20 17:50:44.921611894 +0000 UTC m=+4752.701657342" observedRunningTime="2026-02-20 17:50:45.564022722 +0000 UTC m=+4753.344068130" watchObservedRunningTime="2026-02-20 17:50:45.577048059 +0000 UTC m=+4753.357093467" Feb 20 17:50:46 crc kubenswrapper[4697]: I0220 17:50:46.698481 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:46 crc kubenswrapper[4697]: I0220 17:50:46.698568 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:47 crc kubenswrapper[4697]: I0220 17:50:47.764354 4697 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-vbb5r" podUID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerName="registry-server" probeResult="failure" output=< Feb 20 17:50:47 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 17:50:47 crc kubenswrapper[4697]: > Feb 20 17:50:56 crc kubenswrapper[4697]: I0220 17:50:56.742621 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:56 crc kubenswrapper[4697]: I0220 17:50:56.794982 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:56 crc kubenswrapper[4697]: I0220 17:50:56.992971 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vbb5r"] Feb 20 17:50:58 crc kubenswrapper[4697]: I0220 17:50:58.665676 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vbb5r" podUID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerName="registry-server" containerID="cri-o://f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740" gracePeriod=2 Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.144191 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.261057 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6llb4\" (UniqueName: \"kubernetes.io/projected/6ead966f-3c06-4bd7-9be8-ba47f40fe568-kube-api-access-6llb4\") pod \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.261271 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-catalog-content\") pod \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.261337 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-utilities\") pod \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\" (UID: \"6ead966f-3c06-4bd7-9be8-ba47f40fe568\") " Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.263112 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-utilities" (OuterVolumeSpecName: "utilities") pod "6ead966f-3c06-4bd7-9be8-ba47f40fe568" (UID: "6ead966f-3c06-4bd7-9be8-ba47f40fe568"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.263364 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.268574 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ead966f-3c06-4bd7-9be8-ba47f40fe568-kube-api-access-6llb4" (OuterVolumeSpecName: "kube-api-access-6llb4") pod "6ead966f-3c06-4bd7-9be8-ba47f40fe568" (UID: "6ead966f-3c06-4bd7-9be8-ba47f40fe568"). InnerVolumeSpecName "kube-api-access-6llb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.365039 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6llb4\" (UniqueName: \"kubernetes.io/projected/6ead966f-3c06-4bd7-9be8-ba47f40fe568-kube-api-access-6llb4\") on node \"crc\" DevicePath \"\"" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.375170 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ead966f-3c06-4bd7-9be8-ba47f40fe568" (UID: "6ead966f-3c06-4bd7-9be8-ba47f40fe568"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.467462 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ead966f-3c06-4bd7-9be8-ba47f40fe568-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.676888 4697 generic.go:334] "Generic (PLEG): container finished" podID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerID="f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740" exitCode=0 Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.676944 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vbb5r" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.676973 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbb5r" event={"ID":"6ead966f-3c06-4bd7-9be8-ba47f40fe568","Type":"ContainerDied","Data":"f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740"} Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.677792 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbb5r" event={"ID":"6ead966f-3c06-4bd7-9be8-ba47f40fe568","Type":"ContainerDied","Data":"b28f2c463e25528b752795530bfa94b01cb57792ed5f4c43f858a8b481092779"} Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.677830 4697 scope.go:117] "RemoveContainer" containerID="f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.708945 4697 scope.go:117] "RemoveContainer" containerID="9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.712287 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vbb5r"] Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 
17:50:59.720962 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vbb5r"] Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.743047 4697 scope.go:117] "RemoveContainer" containerID="46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.824145 4697 scope.go:117] "RemoveContainer" containerID="f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740" Feb 20 17:50:59 crc kubenswrapper[4697]: E0220 17:50:59.824749 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740\": container with ID starting with f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740 not found: ID does not exist" containerID="f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.824801 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740"} err="failed to get container status \"f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740\": rpc error: code = NotFound desc = could not find container \"f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740\": container with ID starting with f7c5501d4f75138fa0c21da5f54f35c41d1a8a0ebf00dbb44c3f95b5920f7740 not found: ID does not exist" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.824828 4697 scope.go:117] "RemoveContainer" containerID="9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e" Feb 20 17:50:59 crc kubenswrapper[4697]: E0220 17:50:59.825177 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e\": container with ID 
starting with 9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e not found: ID does not exist" containerID="9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.825220 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e"} err="failed to get container status \"9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e\": rpc error: code = NotFound desc = could not find container \"9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e\": container with ID starting with 9d8fffbd82857492f6fc992405eaef3b57391af790eee169e2474465281da54e not found: ID does not exist" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.825248 4697 scope.go:117] "RemoveContainer" containerID="46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301" Feb 20 17:50:59 crc kubenswrapper[4697]: E0220 17:50:59.825777 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301\": container with ID starting with 46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301 not found: ID does not exist" containerID="46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301" Feb 20 17:50:59 crc kubenswrapper[4697]: I0220 17:50:59.825804 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301"} err="failed to get container status \"46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301\": rpc error: code = NotFound desc = could not find container \"46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301\": container with ID starting with 46109593bf8f53f4935cd0ee6a00726b3a69161161afd6ad5c2fef3491bc2301 not found: 
ID does not exist" Feb 20 17:51:00 crc kubenswrapper[4697]: I0220 17:51:00.896317 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" path="/var/lib/kubelet/pods/6ead966f-3c06-4bd7-9be8-ba47f40fe568/volumes" Feb 20 17:51:01 crc kubenswrapper[4697]: I0220 17:51:01.185738 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:51:01 crc kubenswrapper[4697]: I0220 17:51:01.185830 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:51:01 crc kubenswrapper[4697]: I0220 17:51:01.185895 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 17:51:01 crc kubenswrapper[4697]: I0220 17:51:01.187048 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"41bad0139a8715cab8dbf07356aee82331b7993f510c91a6156eaac1ab9f7fa2"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 17:51:01 crc kubenswrapper[4697]: I0220 17:51:01.187155 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" 
containerID="cri-o://41bad0139a8715cab8dbf07356aee82331b7993f510c91a6156eaac1ab9f7fa2" gracePeriod=600 Feb 20 17:51:01 crc kubenswrapper[4697]: I0220 17:51:01.701102 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="41bad0139a8715cab8dbf07356aee82331b7993f510c91a6156eaac1ab9f7fa2" exitCode=0 Feb 20 17:51:01 crc kubenswrapper[4697]: I0220 17:51:01.701366 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"41bad0139a8715cab8dbf07356aee82331b7993f510c91a6156eaac1ab9f7fa2"} Feb 20 17:51:01 crc kubenswrapper[4697]: I0220 17:51:01.701462 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712"} Feb 20 17:51:01 crc kubenswrapper[4697]: I0220 17:51:01.701488 4697 scope.go:117] "RemoveContainer" containerID="589e0d2d7089d023047f6b5b1cf6537edce8570d48c1166a3cf1237275816d6b" Feb 20 17:51:16 crc kubenswrapper[4697]: E0220 17:51:16.612494 4697 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.44:56242->38.102.83.44:39463: write tcp 38.102.83.44:56242->38.102.83.44:39463: write: broken pipe Feb 20 17:51:39 crc kubenswrapper[4697]: I0220 17:51:39.844023 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rdtx7"] Feb 20 17:51:39 crc kubenswrapper[4697]: E0220 17:51:39.844886 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerName="extract-content" Feb 20 17:51:39 crc kubenswrapper[4697]: I0220 17:51:39.844899 4697 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerName="extract-content"
Feb 20 17:51:39 crc kubenswrapper[4697]: E0220 17:51:39.844916 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerName="extract-utilities"
Feb 20 17:51:39 crc kubenswrapper[4697]: I0220 17:51:39.844923 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerName="extract-utilities"
Feb 20 17:51:39 crc kubenswrapper[4697]: E0220 17:51:39.844946 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerName="registry-server"
Feb 20 17:51:39 crc kubenswrapper[4697]: I0220 17:51:39.844953 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerName="registry-server"
Feb 20 17:51:39 crc kubenswrapper[4697]: I0220 17:51:39.845141 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ead966f-3c06-4bd7-9be8-ba47f40fe568" containerName="registry-server"
Feb 20 17:51:39 crc kubenswrapper[4697]: I0220 17:51:39.846516 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:39 crc kubenswrapper[4697]: I0220 17:51:39.856531 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rdtx7"]
Feb 20 17:51:39 crc kubenswrapper[4697]: I0220 17:51:39.989601 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjcmk\" (UniqueName: \"kubernetes.io/projected/a9515452-27de-4805-9b5e-a312a575f042-kube-api-access-sjcmk\") pod \"community-operators-rdtx7\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") " pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:39 crc kubenswrapper[4697]: I0220 17:51:39.989686 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-catalog-content\") pod \"community-operators-rdtx7\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") " pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:39 crc kubenswrapper[4697]: I0220 17:51:39.989716 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-utilities\") pod \"community-operators-rdtx7\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") " pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:40 crc kubenswrapper[4697]: I0220 17:51:40.091643 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjcmk\" (UniqueName: \"kubernetes.io/projected/a9515452-27de-4805-9b5e-a312a575f042-kube-api-access-sjcmk\") pod \"community-operators-rdtx7\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") " pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:40 crc kubenswrapper[4697]: I0220 17:51:40.091956 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-catalog-content\") pod \"community-operators-rdtx7\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") " pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:40 crc kubenswrapper[4697]: I0220 17:51:40.092109 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-utilities\") pod \"community-operators-rdtx7\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") " pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:40 crc kubenswrapper[4697]: I0220 17:51:40.092412 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-catalog-content\") pod \"community-operators-rdtx7\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") " pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:40 crc kubenswrapper[4697]: I0220 17:51:40.092589 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-utilities\") pod \"community-operators-rdtx7\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") " pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:40 crc kubenswrapper[4697]: I0220 17:51:40.131493 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjcmk\" (UniqueName: \"kubernetes.io/projected/a9515452-27de-4805-9b5e-a312a575f042-kube-api-access-sjcmk\") pod \"community-operators-rdtx7\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") " pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:40 crc kubenswrapper[4697]: I0220 17:51:40.190856 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:40 crc kubenswrapper[4697]: I0220 17:51:40.771113 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rdtx7"]
Feb 20 17:51:41 crc kubenswrapper[4697]: I0220 17:51:41.081523 4697 generic.go:334] "Generic (PLEG): container finished" podID="a9515452-27de-4805-9b5e-a312a575f042" containerID="7e80227a09d79c70706790120f786ea42ac05d446989132f456907d3e6ca98f9" exitCode=0
Feb 20 17:51:41 crc kubenswrapper[4697]: I0220 17:51:41.081579 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdtx7" event={"ID":"a9515452-27de-4805-9b5e-a312a575f042","Type":"ContainerDied","Data":"7e80227a09d79c70706790120f786ea42ac05d446989132f456907d3e6ca98f9"}
Feb 20 17:51:41 crc kubenswrapper[4697]: I0220 17:51:41.081822 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdtx7" event={"ID":"a9515452-27de-4805-9b5e-a312a575f042","Type":"ContainerStarted","Data":"6c99a23d6e4df6511a48b0ebe4a770e1908c4fcb10a596a1fdf6d90bad72681c"}
Feb 20 17:51:42 crc kubenswrapper[4697]: I0220 17:51:42.106809 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdtx7" event={"ID":"a9515452-27de-4805-9b5e-a312a575f042","Type":"ContainerStarted","Data":"5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa"}
Feb 20 17:51:44 crc kubenswrapper[4697]: I0220 17:51:44.133019 4697 generic.go:334] "Generic (PLEG): container finished" podID="a9515452-27de-4805-9b5e-a312a575f042" containerID="5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa" exitCode=0
Feb 20 17:51:44 crc kubenswrapper[4697]: I0220 17:51:44.133076 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdtx7" event={"ID":"a9515452-27de-4805-9b5e-a312a575f042","Type":"ContainerDied","Data":"5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa"}
Feb 20 17:51:45 crc kubenswrapper[4697]: I0220 17:51:45.148491 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdtx7" event={"ID":"a9515452-27de-4805-9b5e-a312a575f042","Type":"ContainerStarted","Data":"147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1"}
Feb 20 17:51:45 crc kubenswrapper[4697]: I0220 17:51:45.179950 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rdtx7" podStartSLOduration=2.528311435 podStartE2EDuration="6.179926918s" podCreationTimestamp="2026-02-20 17:51:39 +0000 UTC" firstStartedPulling="2026-02-20 17:51:41.083964695 +0000 UTC m=+4808.864010103" lastFinishedPulling="2026-02-20 17:51:44.735580138 +0000 UTC m=+4812.515625586" observedRunningTime="2026-02-20 17:51:45.168823668 +0000 UTC m=+4812.948869076" watchObservedRunningTime="2026-02-20 17:51:45.179926918 +0000 UTC m=+4812.959972346"
Feb 20 17:51:50 crc kubenswrapper[4697]: I0220 17:51:50.191379 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:50 crc kubenswrapper[4697]: I0220 17:51:50.191820 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:50 crc kubenswrapper[4697]: I0220 17:51:50.252349 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:51 crc kubenswrapper[4697]: I0220 17:51:51.265355 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:51 crc kubenswrapper[4697]: I0220 17:51:51.315485 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rdtx7"]
Feb 20 17:51:53 crc kubenswrapper[4697]: I0220 17:51:53.225151 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rdtx7" podUID="a9515452-27de-4805-9b5e-a312a575f042" containerName="registry-server" containerID="cri-o://147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1" gracePeriod=2
Feb 20 17:51:53 crc kubenswrapper[4697]: I0220 17:51:53.864275 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.044573 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-catalog-content\") pod \"a9515452-27de-4805-9b5e-a312a575f042\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") "
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.044748 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-utilities\") pod \"a9515452-27de-4805-9b5e-a312a575f042\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") "
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.044806 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjcmk\" (UniqueName: \"kubernetes.io/projected/a9515452-27de-4805-9b5e-a312a575f042-kube-api-access-sjcmk\") pod \"a9515452-27de-4805-9b5e-a312a575f042\" (UID: \"a9515452-27de-4805-9b5e-a312a575f042\") "
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.046176 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-utilities" (OuterVolumeSpecName: "utilities") pod "a9515452-27de-4805-9b5e-a312a575f042" (UID: "a9515452-27de-4805-9b5e-a312a575f042"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.053621 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9515452-27de-4805-9b5e-a312a575f042-kube-api-access-sjcmk" (OuterVolumeSpecName: "kube-api-access-sjcmk") pod "a9515452-27de-4805-9b5e-a312a575f042" (UID: "a9515452-27de-4805-9b5e-a312a575f042"). InnerVolumeSpecName "kube-api-access-sjcmk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.099488 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9515452-27de-4805-9b5e-a312a575f042" (UID: "a9515452-27de-4805-9b5e-a312a575f042"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.147055 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.147092 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9515452-27de-4805-9b5e-a312a575f042-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.147105 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjcmk\" (UniqueName: \"kubernetes.io/projected/a9515452-27de-4805-9b5e-a312a575f042-kube-api-access-sjcmk\") on node \"crc\" DevicePath \"\""
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.236974 4697 generic.go:334] "Generic (PLEG): container finished" podID="a9515452-27de-4805-9b5e-a312a575f042" containerID="147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1" exitCode=0
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.237034 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdtx7" event={"ID":"a9515452-27de-4805-9b5e-a312a575f042","Type":"ContainerDied","Data":"147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1"}
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.237051 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rdtx7"
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.237074 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdtx7" event={"ID":"a9515452-27de-4805-9b5e-a312a575f042","Type":"ContainerDied","Data":"6c99a23d6e4df6511a48b0ebe4a770e1908c4fcb10a596a1fdf6d90bad72681c"}
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.237096 4697 scope.go:117] "RemoveContainer" containerID="147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1"
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.264925 4697 scope.go:117] "RemoveContainer" containerID="5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa"
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.286620 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rdtx7"]
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.300534 4697 scope.go:117] "RemoveContainer" containerID="7e80227a09d79c70706790120f786ea42ac05d446989132f456907d3e6ca98f9"
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.303353 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rdtx7"]
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.369189 4697 scope.go:117] "RemoveContainer" containerID="147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1"
Feb 20 17:51:54 crc kubenswrapper[4697]: E0220 17:51:54.369720 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1\": container with ID starting with 147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1 not found: ID does not exist" containerID="147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1"
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.369775 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1"} err="failed to get container status \"147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1\": rpc error: code = NotFound desc = could not find container \"147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1\": container with ID starting with 147b18e905265d00ec03c731dd6f6b7ea870e4a4ba015b347e67d703d646ebe1 not found: ID does not exist"
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.369811 4697 scope.go:117] "RemoveContainer" containerID="5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa"
Feb 20 17:51:54 crc kubenswrapper[4697]: E0220 17:51:54.370387 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa\": container with ID starting with 5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa not found: ID does not exist" containerID="5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa"
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.370412 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa"} err="failed to get container status \"5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa\": rpc error: code = NotFound desc = could not find container \"5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa\": container with ID starting with 5ce6e097e942ab1e7714a681a5bc404b3c865f224808b9d8b884689c4a2a4eaa not found: ID does not exist"
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.370446 4697 scope.go:117] "RemoveContainer" containerID="7e80227a09d79c70706790120f786ea42ac05d446989132f456907d3e6ca98f9"
Feb 20 17:51:54 crc kubenswrapper[4697]: E0220 17:51:54.370773 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e80227a09d79c70706790120f786ea42ac05d446989132f456907d3e6ca98f9\": container with ID starting with 7e80227a09d79c70706790120f786ea42ac05d446989132f456907d3e6ca98f9 not found: ID does not exist" containerID="7e80227a09d79c70706790120f786ea42ac05d446989132f456907d3e6ca98f9"
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.370811 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e80227a09d79c70706790120f786ea42ac05d446989132f456907d3e6ca98f9"} err="failed to get container status \"7e80227a09d79c70706790120f786ea42ac05d446989132f456907d3e6ca98f9\": rpc error: code = NotFound desc = could not find container \"7e80227a09d79c70706790120f786ea42ac05d446989132f456907d3e6ca98f9\": container with ID starting with 7e80227a09d79c70706790120f786ea42ac05d446989132f456907d3e6ca98f9 not found: ID does not exist"
Feb 20 17:51:54 crc kubenswrapper[4697]: I0220 17:51:54.890965 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9515452-27de-4805-9b5e-a312a575f042" path="/var/lib/kubelet/pods/a9515452-27de-4805-9b5e-a312a575f042/volumes"
Feb 20 17:53:01 crc kubenswrapper[4697]: I0220 17:53:01.185415 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 17:53:01 crc kubenswrapper[4697]: I0220 17:53:01.186073 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.014242 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7bnd2"]
Feb 20 17:53:10 crc kubenswrapper[4697]: E0220 17:53:10.015248 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9515452-27de-4805-9b5e-a312a575f042" containerName="registry-server"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.015266 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9515452-27de-4805-9b5e-a312a575f042" containerName="registry-server"
Feb 20 17:53:10 crc kubenswrapper[4697]: E0220 17:53:10.015294 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9515452-27de-4805-9b5e-a312a575f042" containerName="extract-utilities"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.015302 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9515452-27de-4805-9b5e-a312a575f042" containerName="extract-utilities"
Feb 20 17:53:10 crc kubenswrapper[4697]: E0220 17:53:10.015315 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9515452-27de-4805-9b5e-a312a575f042" containerName="extract-content"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.015323 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9515452-27de-4805-9b5e-a312a575f042" containerName="extract-content"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.015632 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9515452-27de-4805-9b5e-a312a575f042" containerName="registry-server"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.017347 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.025258 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bnd2"]
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.112982 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc9ql\" (UniqueName: \"kubernetes.io/projected/4f5558b2-7599-40a4-8e11-e27f94544f09-kube-api-access-tc9ql\") pod \"certified-operators-7bnd2\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") " pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.113194 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-utilities\") pod \"certified-operators-7bnd2\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") " pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.113257 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-catalog-content\") pod \"certified-operators-7bnd2\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") " pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.215028 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc9ql\" (UniqueName: \"kubernetes.io/projected/4f5558b2-7599-40a4-8e11-e27f94544f09-kube-api-access-tc9ql\") pod \"certified-operators-7bnd2\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") " pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.215383 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-utilities\") pod \"certified-operators-7bnd2\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") " pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.215521 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-catalog-content\") pod \"certified-operators-7bnd2\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") " pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.215846 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-utilities\") pod \"certified-operators-7bnd2\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") " pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.215961 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-catalog-content\") pod \"certified-operators-7bnd2\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") " pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.233903 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc9ql\" (UniqueName: \"kubernetes.io/projected/4f5558b2-7599-40a4-8e11-e27f94544f09-kube-api-access-tc9ql\") pod \"certified-operators-7bnd2\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") " pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.370696 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:10 crc kubenswrapper[4697]: W0220 17:53:10.908895 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f5558b2_7599_40a4_8e11_e27f94544f09.slice/crio-3c9062b8a0eebae6f9d2da0ba7660e2524c77e5af0e4a9bb7a0f74a7be2402cb WatchSource:0}: Error finding container 3c9062b8a0eebae6f9d2da0ba7660e2524c77e5af0e4a9bb7a0f74a7be2402cb: Status 404 returned error can't find the container with id 3c9062b8a0eebae6f9d2da0ba7660e2524c77e5af0e4a9bb7a0f74a7be2402cb
Feb 20 17:53:10 crc kubenswrapper[4697]: I0220 17:53:10.913684 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bnd2"]
Feb 20 17:53:11 crc kubenswrapper[4697]: I0220 17:53:11.009426 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bnd2" event={"ID":"4f5558b2-7599-40a4-8e11-e27f94544f09","Type":"ContainerStarted","Data":"3c9062b8a0eebae6f9d2da0ba7660e2524c77e5af0e4a9bb7a0f74a7be2402cb"}
Feb 20 17:53:12 crc kubenswrapper[4697]: I0220 17:53:12.019880 4697 generic.go:334] "Generic (PLEG): container finished" podID="4f5558b2-7599-40a4-8e11-e27f94544f09" containerID="5b71679b3bc5303e2e3067ccbe5c61ba36b3efebaa86e59654239b48ef5dd462" exitCode=0
Feb 20 17:53:12 crc kubenswrapper[4697]: I0220 17:53:12.019996 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bnd2" event={"ID":"4f5558b2-7599-40a4-8e11-e27f94544f09","Type":"ContainerDied","Data":"5b71679b3bc5303e2e3067ccbe5c61ba36b3efebaa86e59654239b48ef5dd462"}
Feb 20 17:53:13 crc kubenswrapper[4697]: I0220 17:53:13.031847 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bnd2" event={"ID":"4f5558b2-7599-40a4-8e11-e27f94544f09","Type":"ContainerStarted","Data":"f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2"}
Feb 20 17:53:15 crc kubenswrapper[4697]: I0220 17:53:15.053039 4697 generic.go:334] "Generic (PLEG): container finished" podID="4f5558b2-7599-40a4-8e11-e27f94544f09" containerID="f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2" exitCode=0
Feb 20 17:53:15 crc kubenswrapper[4697]: I0220 17:53:15.053128 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bnd2" event={"ID":"4f5558b2-7599-40a4-8e11-e27f94544f09","Type":"ContainerDied","Data":"f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2"}
Feb 20 17:53:16 crc kubenswrapper[4697]: I0220 17:53:16.065155 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bnd2" event={"ID":"4f5558b2-7599-40a4-8e11-e27f94544f09","Type":"ContainerStarted","Data":"ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd"}
Feb 20 17:53:16 crc kubenswrapper[4697]: I0220 17:53:16.088703 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7bnd2" podStartSLOduration=3.670050704 podStartE2EDuration="7.088683738s" podCreationTimestamp="2026-02-20 17:53:09 +0000 UTC" firstStartedPulling="2026-02-20 17:53:12.024008175 +0000 UTC m=+4899.804053583" lastFinishedPulling="2026-02-20 17:53:15.442641209 +0000 UTC m=+4903.222686617" observedRunningTime="2026-02-20 17:53:16.087463998 +0000 UTC m=+4903.867509406" watchObservedRunningTime="2026-02-20 17:53:16.088683738 +0000 UTC m=+4903.868729146"
Feb 20 17:53:20 crc kubenswrapper[4697]: I0220 17:53:20.371822 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:20 crc kubenswrapper[4697]: I0220 17:53:20.372527 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:20 crc kubenswrapper[4697]: I0220 17:53:20.923019 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:21 crc kubenswrapper[4697]: I0220 17:53:21.160700 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:21 crc kubenswrapper[4697]: I0220 17:53:21.209121 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7bnd2"]
Feb 20 17:53:23 crc kubenswrapper[4697]: I0220 17:53:23.136300 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7bnd2" podUID="4f5558b2-7599-40a4-8e11-e27f94544f09" containerName="registry-server" containerID="cri-o://ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd" gracePeriod=2
Feb 20 17:53:23 crc kubenswrapper[4697]: I0220 17:53:23.683217 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:23 crc kubenswrapper[4697]: I0220 17:53:23.809276 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-catalog-content\") pod \"4f5558b2-7599-40a4-8e11-e27f94544f09\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") "
Feb 20 17:53:23 crc kubenswrapper[4697]: I0220 17:53:23.809422 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc9ql\" (UniqueName: \"kubernetes.io/projected/4f5558b2-7599-40a4-8e11-e27f94544f09-kube-api-access-tc9ql\") pod \"4f5558b2-7599-40a4-8e11-e27f94544f09\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") "
Feb 20 17:53:23 crc kubenswrapper[4697]: I0220 17:53:23.809482 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-utilities\") pod \"4f5558b2-7599-40a4-8e11-e27f94544f09\" (UID: \"4f5558b2-7599-40a4-8e11-e27f94544f09\") "
Feb 20 17:53:23 crc kubenswrapper[4697]: I0220 17:53:23.810322 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-utilities" (OuterVolumeSpecName: "utilities") pod "4f5558b2-7599-40a4-8e11-e27f94544f09" (UID: "4f5558b2-7599-40a4-8e11-e27f94544f09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 17:53:23 crc kubenswrapper[4697]: I0220 17:53:23.816166 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5558b2-7599-40a4-8e11-e27f94544f09-kube-api-access-tc9ql" (OuterVolumeSpecName: "kube-api-access-tc9ql") pod "4f5558b2-7599-40a4-8e11-e27f94544f09" (UID: "4f5558b2-7599-40a4-8e11-e27f94544f09"). InnerVolumeSpecName "kube-api-access-tc9ql". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 17:53:23 crc kubenswrapper[4697]: I0220 17:53:23.868705 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f5558b2-7599-40a4-8e11-e27f94544f09" (UID: "4f5558b2-7599-40a4-8e11-e27f94544f09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 17:53:23 crc kubenswrapper[4697]: I0220 17:53:23.911972 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 17:53:23 crc kubenswrapper[4697]: I0220 17:53:23.912018 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc9ql\" (UniqueName: \"kubernetes.io/projected/4f5558b2-7599-40a4-8e11-e27f94544f09-kube-api-access-tc9ql\") on node \"crc\" DevicePath \"\""
Feb 20 17:53:23 crc kubenswrapper[4697]: I0220 17:53:23.912029 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f5558b2-7599-40a4-8e11-e27f94544f09-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.145351 4697 generic.go:334] "Generic (PLEG): container finished" podID="4f5558b2-7599-40a4-8e11-e27f94544f09" containerID="ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd" exitCode=0
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.145395 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bnd2" event={"ID":"4f5558b2-7599-40a4-8e11-e27f94544f09","Type":"ContainerDied","Data":"ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd"}
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.145423 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bnd2" event={"ID":"4f5558b2-7599-40a4-8e11-e27f94544f09","Type":"ContainerDied","Data":"3c9062b8a0eebae6f9d2da0ba7660e2524c77e5af0e4a9bb7a0f74a7be2402cb"}
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.145465 4697 scope.go:117] "RemoveContainer" containerID="ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd"
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.145588 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bnd2"
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.179447 4697 scope.go:117] "RemoveContainer" containerID="f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2"
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.182988 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7bnd2"]
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.201455 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7bnd2"]
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.203722 4697 scope.go:117] "RemoveContainer" containerID="5b71679b3bc5303e2e3067ccbe5c61ba36b3efebaa86e59654239b48ef5dd462"
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.258001 4697 scope.go:117] "RemoveContainer" containerID="ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd"
Feb 20 17:53:24 crc kubenswrapper[4697]: E0220 17:53:24.258354 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd\": container with ID starting with ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd not found: ID does not exist" containerID="ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd"
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.258389 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd"} err="failed to get container status \"ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd\": rpc error: code = NotFound desc = could not find container \"ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd\": container with ID starting with ca379d928fce2250e16aead800a9e42210a94abe4e33f3bb182e06486ae556bd not found: ID does not exist"
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.258408 4697 scope.go:117] "RemoveContainer" containerID="f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2"
Feb 20 17:53:24 crc kubenswrapper[4697]: E0220 17:53:24.258787 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2\": container with ID starting with f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2 not found: ID does not exist" containerID="f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2"
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.258815 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2"} err="failed to get container status \"f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2\": rpc error: code = NotFound desc = could not find container \"f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2\": container with ID starting with f34ce46d23b5451031dbaabd6b96ea1156963cff088c733c113ecb2edd36bac2 not found: ID does not exist"
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.258829 4697 scope.go:117] "RemoveContainer" containerID="5b71679b3bc5303e2e3067ccbe5c61ba36b3efebaa86e59654239b48ef5dd462"
Feb 20 17:53:24 crc kubenswrapper[4697]: E0220 17:53:24.259219 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b71679b3bc5303e2e3067ccbe5c61ba36b3efebaa86e59654239b48ef5dd462\": container with ID starting with 5b71679b3bc5303e2e3067ccbe5c61ba36b3efebaa86e59654239b48ef5dd462 not found: ID does not exist" containerID="5b71679b3bc5303e2e3067ccbe5c61ba36b3efebaa86e59654239b48ef5dd462"
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.259247 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b71679b3bc5303e2e3067ccbe5c61ba36b3efebaa86e59654239b48ef5dd462"} err="failed to get container status \"5b71679b3bc5303e2e3067ccbe5c61ba36b3efebaa86e59654239b48ef5dd462\": rpc error: code = NotFound desc = could not find container \"5b71679b3bc5303e2e3067ccbe5c61ba36b3efebaa86e59654239b48ef5dd462\": container with ID starting with 5b71679b3bc5303e2e3067ccbe5c61ba36b3efebaa86e59654239b48ef5dd462 not found: ID does not exist"
Feb 20 17:53:24 crc kubenswrapper[4697]: I0220 17:53:24.888610 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f5558b2-7599-40a4-8e11-e27f94544f09" path="/var/lib/kubelet/pods/4f5558b2-7599-40a4-8e11-e27f94544f09/volumes"
Feb 20 17:53:31 crc kubenswrapper[4697]: I0220 17:53:31.184567 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 17:53:31 crc kubenswrapper[4697]: I0220 17:53:31.185152 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 20 17:54:01 crc kubenswrapper[4697]: I0220 17:54:01.185513 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 17:54:01 crc kubenswrapper[4697]: I0220 17:54:01.186718 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 17:54:01 crc kubenswrapper[4697]: I0220 17:54:01.186841 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 17:54:01 crc kubenswrapper[4697]: I0220 17:54:01.188600 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 17:54:01 crc kubenswrapper[4697]: I0220 17:54:01.188729 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" gracePeriod=600 Feb 20 17:54:01 crc kubenswrapper[4697]: E0220 17:54:01.338933 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:54:01 crc kubenswrapper[4697]: I0220 17:54:01.507014 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" exitCode=0 Feb 20 17:54:01 crc kubenswrapper[4697]: I0220 17:54:01.507076 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712"} Feb 20 17:54:01 crc kubenswrapper[4697]: I0220 17:54:01.507131 4697 scope.go:117] "RemoveContainer" containerID="41bad0139a8715cab8dbf07356aee82331b7993f510c91a6156eaac1ab9f7fa2" Feb 20 17:54:01 crc kubenswrapper[4697]: I0220 17:54:01.508373 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:54:01 crc kubenswrapper[4697]: E0220 17:54:01.508908 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:54:11 crc kubenswrapper[4697]: I0220 17:54:11.879322 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:54:11 crc kubenswrapper[4697]: 
E0220 17:54:11.880618 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:54:23 crc kubenswrapper[4697]: I0220 17:54:23.877049 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:54:23 crc kubenswrapper[4697]: E0220 17:54:23.877742 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:54:35 crc kubenswrapper[4697]: I0220 17:54:35.878257 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:54:35 crc kubenswrapper[4697]: E0220 17:54:35.879351 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:54:49 crc kubenswrapper[4697]: I0220 17:54:49.877646 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:54:49 crc 
kubenswrapper[4697]: E0220 17:54:49.878700 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:55:03 crc kubenswrapper[4697]: I0220 17:55:03.877407 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:55:03 crc kubenswrapper[4697]: E0220 17:55:03.878303 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:55:17 crc kubenswrapper[4697]: I0220 17:55:17.877512 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:55:17 crc kubenswrapper[4697]: E0220 17:55:17.878267 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:55:30 crc kubenswrapper[4697]: I0220 17:55:30.877695 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 
20 17:55:30 crc kubenswrapper[4697]: E0220 17:55:30.878325 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:55:45 crc kubenswrapper[4697]: I0220 17:55:45.877794 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:55:45 crc kubenswrapper[4697]: E0220 17:55:45.879653 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:55:57 crc kubenswrapper[4697]: I0220 17:55:57.876526 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:55:57 crc kubenswrapper[4697]: E0220 17:55:57.877312 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:56:10 crc kubenswrapper[4697]: I0220 17:56:10.877984 4697 scope.go:117] "RemoveContainer" 
containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:56:10 crc kubenswrapper[4697]: E0220 17:56:10.878934 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:56:25 crc kubenswrapper[4697]: I0220 17:56:25.877312 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:56:25 crc kubenswrapper[4697]: E0220 17:56:25.878033 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:56:38 crc kubenswrapper[4697]: I0220 17:56:38.881872 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:56:38 crc kubenswrapper[4697]: E0220 17:56:38.882597 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:56:53 crc kubenswrapper[4697]: I0220 17:56:53.877946 4697 scope.go:117] 
"RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:56:53 crc kubenswrapper[4697]: E0220 17:56:53.878716 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:57:05 crc kubenswrapper[4697]: I0220 17:57:05.877536 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:57:05 crc kubenswrapper[4697]: E0220 17:57:05.878539 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:57:20 crc kubenswrapper[4697]: I0220 17:57:20.877814 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:57:20 crc kubenswrapper[4697]: E0220 17:57:20.878579 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:57:31 crc kubenswrapper[4697]: I0220 17:57:31.877471 
4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:57:31 crc kubenswrapper[4697]: E0220 17:57:31.878259 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:57:45 crc kubenswrapper[4697]: I0220 17:57:45.876996 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:57:45 crc kubenswrapper[4697]: E0220 17:57:45.877643 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:57:57 crc kubenswrapper[4697]: I0220 17:57:57.877417 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:57:57 crc kubenswrapper[4697]: E0220 17:57:57.878046 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:58:10 crc kubenswrapper[4697]: I0220 
17:58:10.877210 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:58:10 crc kubenswrapper[4697]: E0220 17:58:10.878006 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:58:21 crc kubenswrapper[4697]: I0220 17:58:21.876917 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:58:21 crc kubenswrapper[4697]: E0220 17:58:21.877682 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.359551 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r9rnm"] Feb 20 17:58:22 crc kubenswrapper[4697]: E0220 17:58:22.360557 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5558b2-7599-40a4-8e11-e27f94544f09" containerName="extract-content" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.360665 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5558b2-7599-40a4-8e11-e27f94544f09" containerName="extract-content" Feb 20 17:58:22 crc kubenswrapper[4697]: E0220 17:58:22.360770 4697 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4f5558b2-7599-40a4-8e11-e27f94544f09" containerName="registry-server" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.360848 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5558b2-7599-40a4-8e11-e27f94544f09" containerName="registry-server" Feb 20 17:58:22 crc kubenswrapper[4697]: E0220 17:58:22.360929 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5558b2-7599-40a4-8e11-e27f94544f09" containerName="extract-utilities" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.360994 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5558b2-7599-40a4-8e11-e27f94544f09" containerName="extract-utilities" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.361236 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5558b2-7599-40a4-8e11-e27f94544f09" containerName="registry-server" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.363175 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.396595 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9rnm"] Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.436821 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wbdd\" (UniqueName: \"kubernetes.io/projected/e41ea39e-c37c-44b9-8af5-344b32773c98-kube-api-access-5wbdd\") pod \"redhat-marketplace-r9rnm\" (UID: \"e41ea39e-c37c-44b9-8af5-344b32773c98\") " pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.437132 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-catalog-content\") pod \"redhat-marketplace-r9rnm\" (UID: 
\"e41ea39e-c37c-44b9-8af5-344b32773c98\") " pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.437385 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-utilities\") pod \"redhat-marketplace-r9rnm\" (UID: \"e41ea39e-c37c-44b9-8af5-344b32773c98\") " pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.539259 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-utilities\") pod \"redhat-marketplace-r9rnm\" (UID: \"e41ea39e-c37c-44b9-8af5-344b32773c98\") " pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.539881 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-utilities\") pod \"redhat-marketplace-r9rnm\" (UID: \"e41ea39e-c37c-44b9-8af5-344b32773c98\") " pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.539902 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wbdd\" (UniqueName: \"kubernetes.io/projected/e41ea39e-c37c-44b9-8af5-344b32773c98-kube-api-access-5wbdd\") pod \"redhat-marketplace-r9rnm\" (UID: \"e41ea39e-c37c-44b9-8af5-344b32773c98\") " pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.540063 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-catalog-content\") pod \"redhat-marketplace-r9rnm\" (UID: 
\"e41ea39e-c37c-44b9-8af5-344b32773c98\") " pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.540528 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-catalog-content\") pod \"redhat-marketplace-r9rnm\" (UID: \"e41ea39e-c37c-44b9-8af5-344b32773c98\") " pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.560560 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wbdd\" (UniqueName: \"kubernetes.io/projected/e41ea39e-c37c-44b9-8af5-344b32773c98-kube-api-access-5wbdd\") pod \"redhat-marketplace-r9rnm\" (UID: \"e41ea39e-c37c-44b9-8af5-344b32773c98\") " pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:22 crc kubenswrapper[4697]: I0220 17:58:22.695234 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:23 crc kubenswrapper[4697]: I0220 17:58:23.264048 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9rnm"] Feb 20 17:58:23 crc kubenswrapper[4697]: I0220 17:58:23.569215 4697 generic.go:334] "Generic (PLEG): container finished" podID="e41ea39e-c37c-44b9-8af5-344b32773c98" containerID="4cb4b53163b69db522c40e392a516b8911eced7243787a9c5dc0c8879a23b225" exitCode=0 Feb 20 17:58:23 crc kubenswrapper[4697]: I0220 17:58:23.569268 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9rnm" event={"ID":"e41ea39e-c37c-44b9-8af5-344b32773c98","Type":"ContainerDied","Data":"4cb4b53163b69db522c40e392a516b8911eced7243787a9c5dc0c8879a23b225"} Feb 20 17:58:23 crc kubenswrapper[4697]: I0220 17:58:23.569301 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9rnm" 
event={"ID":"e41ea39e-c37c-44b9-8af5-344b32773c98","Type":"ContainerStarted","Data":"968a539a3b7dd494e3a59e22a004d9516ce8475dd16ebeb154a3ff2113883178"} Feb 20 17:58:23 crc kubenswrapper[4697]: I0220 17:58:23.571259 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 17:58:24 crc kubenswrapper[4697]: I0220 17:58:24.581223 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9rnm" event={"ID":"e41ea39e-c37c-44b9-8af5-344b32773c98","Type":"ContainerStarted","Data":"9dc36024961282dbaac4f3ece632211a4bda1152e21b2bb4609983dd494ce984"} Feb 20 17:58:25 crc kubenswrapper[4697]: I0220 17:58:25.596670 4697 generic.go:334] "Generic (PLEG): container finished" podID="e41ea39e-c37c-44b9-8af5-344b32773c98" containerID="9dc36024961282dbaac4f3ece632211a4bda1152e21b2bb4609983dd494ce984" exitCode=0 Feb 20 17:58:25 crc kubenswrapper[4697]: I0220 17:58:25.596703 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9rnm" event={"ID":"e41ea39e-c37c-44b9-8af5-344b32773c98","Type":"ContainerDied","Data":"9dc36024961282dbaac4f3ece632211a4bda1152e21b2bb4609983dd494ce984"} Feb 20 17:58:26 crc kubenswrapper[4697]: I0220 17:58:26.614559 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9rnm" event={"ID":"e41ea39e-c37c-44b9-8af5-344b32773c98","Type":"ContainerStarted","Data":"c728c54126ce607b96ef95eb0c95586937fc490e3bb8a282ae9a61b35f55801d"} Feb 20 17:58:26 crc kubenswrapper[4697]: I0220 17:58:26.638813 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r9rnm" podStartSLOduration=2.200251844 podStartE2EDuration="4.638792693s" podCreationTimestamp="2026-02-20 17:58:22 +0000 UTC" firstStartedPulling="2026-02-20 17:58:23.571002004 +0000 UTC m=+5211.351047412" lastFinishedPulling="2026-02-20 17:58:26.009542853 +0000 UTC 
m=+5213.789588261" observedRunningTime="2026-02-20 17:58:26.632777547 +0000 UTC m=+5214.412822995" watchObservedRunningTime="2026-02-20 17:58:26.638792693 +0000 UTC m=+5214.418838101" Feb 20 17:58:32 crc kubenswrapper[4697]: I0220 17:58:32.695477 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:32 crc kubenswrapper[4697]: I0220 17:58:32.695936 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:32 crc kubenswrapper[4697]: I0220 17:58:32.766300 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:33 crc kubenswrapper[4697]: I0220 17:58:33.741602 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:34 crc kubenswrapper[4697]: I0220 17:58:34.876680 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:58:34 crc kubenswrapper[4697]: E0220 17:58:34.877200 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.252848 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9rnm"] Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.253862 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r9rnm" 
podUID="e41ea39e-c37c-44b9-8af5-344b32773c98" containerName="registry-server" containerID="cri-o://c728c54126ce607b96ef95eb0c95586937fc490e3bb8a282ae9a61b35f55801d" gracePeriod=2 Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.733503 4697 generic.go:334] "Generic (PLEG): container finished" podID="e41ea39e-c37c-44b9-8af5-344b32773c98" containerID="c728c54126ce607b96ef95eb0c95586937fc490e3bb8a282ae9a61b35f55801d" exitCode=0 Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.733572 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9rnm" event={"ID":"e41ea39e-c37c-44b9-8af5-344b32773c98","Type":"ContainerDied","Data":"c728c54126ce607b96ef95eb0c95586937fc490e3bb8a282ae9a61b35f55801d"} Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.733911 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9rnm" event={"ID":"e41ea39e-c37c-44b9-8af5-344b32773c98","Type":"ContainerDied","Data":"968a539a3b7dd494e3a59e22a004d9516ce8475dd16ebeb154a3ff2113883178"} Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.733932 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="968a539a3b7dd494e3a59e22a004d9516ce8475dd16ebeb154a3ff2113883178" Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.763188 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.870964 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-utilities\") pod \"e41ea39e-c37c-44b9-8af5-344b32773c98\" (UID: \"e41ea39e-c37c-44b9-8af5-344b32773c98\") " Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.871611 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wbdd\" (UniqueName: \"kubernetes.io/projected/e41ea39e-c37c-44b9-8af5-344b32773c98-kube-api-access-5wbdd\") pod \"e41ea39e-c37c-44b9-8af5-344b32773c98\" (UID: \"e41ea39e-c37c-44b9-8af5-344b32773c98\") " Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.871942 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-utilities" (OuterVolumeSpecName: "utilities") pod "e41ea39e-c37c-44b9-8af5-344b32773c98" (UID: "e41ea39e-c37c-44b9-8af5-344b32773c98"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.872487 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-catalog-content\") pod \"e41ea39e-c37c-44b9-8af5-344b32773c98\" (UID: \"e41ea39e-c37c-44b9-8af5-344b32773c98\") " Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.873352 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.888106 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e41ea39e-c37c-44b9-8af5-344b32773c98-kube-api-access-5wbdd" (OuterVolumeSpecName: "kube-api-access-5wbdd") pod "e41ea39e-c37c-44b9-8af5-344b32773c98" (UID: "e41ea39e-c37c-44b9-8af5-344b32773c98"). InnerVolumeSpecName "kube-api-access-5wbdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.905266 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e41ea39e-c37c-44b9-8af5-344b32773c98" (UID: "e41ea39e-c37c-44b9-8af5-344b32773c98"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.975729 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wbdd\" (UniqueName: \"kubernetes.io/projected/e41ea39e-c37c-44b9-8af5-344b32773c98-kube-api-access-5wbdd\") on node \"crc\" DevicePath \"\"" Feb 20 17:58:36 crc kubenswrapper[4697]: I0220 17:58:36.976074 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e41ea39e-c37c-44b9-8af5-344b32773c98-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 17:58:37 crc kubenswrapper[4697]: I0220 17:58:37.741252 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9rnm" Feb 20 17:58:37 crc kubenswrapper[4697]: I0220 17:58:37.775025 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9rnm"] Feb 20 17:58:37 crc kubenswrapper[4697]: I0220 17:58:37.785020 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9rnm"] Feb 20 17:58:38 crc kubenswrapper[4697]: I0220 17:58:38.893735 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e41ea39e-c37c-44b9-8af5-344b32773c98" path="/var/lib/kubelet/pods/e41ea39e-c37c-44b9-8af5-344b32773c98/volumes" Feb 20 17:58:47 crc kubenswrapper[4697]: I0220 17:58:47.877373 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:58:47 crc kubenswrapper[4697]: E0220 17:58:47.878415 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:58:58 crc kubenswrapper[4697]: I0220 17:58:58.878111 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:58:58 crc kubenswrapper[4697]: E0220 17:58:58.879041 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 17:59:06 crc kubenswrapper[4697]: I0220 17:59:06.015293 4697 generic.go:334] "Generic (PLEG): container finished" podID="e83acb01-3a91-4950-848f-d447679c0533" containerID="cd8359e22280ef738d40191afad5bb4cfcccdc79bf17785a29085c7a3e81b60c" exitCode=0 Feb 20 17:59:06 crc kubenswrapper[4697]: I0220 17:59:06.015399 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e83acb01-3a91-4950-848f-d447679c0533","Type":"ContainerDied","Data":"cd8359e22280ef738d40191afad5bb4cfcccdc79bf17785a29085c7a3e81b60c"} Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.431263 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.533309 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ssh-key\") pod \"e83acb01-3a91-4950-848f-d447679c0533\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.533403 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-workdir\") pod \"e83acb01-3a91-4950-848f-d447679c0533\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.533498 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxswp\" (UniqueName: \"kubernetes.io/projected/e83acb01-3a91-4950-848f-d447679c0533-kube-api-access-cxswp\") pod \"e83acb01-3a91-4950-848f-d447679c0533\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.533550 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-openstack-config-secret\") pod \"e83acb01-3a91-4950-848f-d447679c0533\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.533572 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-temporary\") pod \"e83acb01-3a91-4950-848f-d447679c0533\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.533605 4697 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ca-certs\") pod \"e83acb01-3a91-4950-848f-d447679c0533\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.533647 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-config-data\") pod \"e83acb01-3a91-4950-848f-d447679c0533\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.533709 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-openstack-config\") pod \"e83acb01-3a91-4950-848f-d447679c0533\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.533793 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"e83acb01-3a91-4950-848f-d447679c0533\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.534249 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "e83acb01-3a91-4950-848f-d447679c0533" (UID: "e83acb01-3a91-4950-848f-d447679c0533"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.534814 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-config-data" (OuterVolumeSpecName: "config-data") pod "e83acb01-3a91-4950-848f-d447679c0533" (UID: "e83acb01-3a91-4950-848f-d447679c0533"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.539093 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83acb01-3a91-4950-848f-d447679c0533-kube-api-access-cxswp" (OuterVolumeSpecName: "kube-api-access-cxswp") pod "e83acb01-3a91-4950-848f-d447679c0533" (UID: "e83acb01-3a91-4950-848f-d447679c0533"). InnerVolumeSpecName "kube-api-access-cxswp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.542686 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "e83acb01-3a91-4950-848f-d447679c0533" (UID: "e83acb01-3a91-4950-848f-d447679c0533"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.546101 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e83acb01-3a91-4950-848f-d447679c0533" (UID: "e83acb01-3a91-4950-848f-d447679c0533"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.574000 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e83acb01-3a91-4950-848f-d447679c0533" (UID: "e83acb01-3a91-4950-848f-d447679c0533"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.576922 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e83acb01-3a91-4950-848f-d447679c0533" (UID: "e83acb01-3a91-4950-848f-d447679c0533"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:59:07 crc kubenswrapper[4697]: E0220 17:59:07.594949 4697 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-openstack-config-secret podName:e83acb01-3a91-4950-848f-d447679c0533 nodeName:}" failed. No retries permitted until 2026-02-20 17:59:08.09491822 +0000 UTC m=+5255.874963628 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-openstack-config-secret") pod "e83acb01-3a91-4950-848f-d447679c0533" (UID: "e83acb01-3a91-4950-848f-d447679c0533") : error deleting /var/lib/kubelet/pods/e83acb01-3a91-4950-848f-d447679c0533/volume-subpaths: remove /var/lib/kubelet/pods/e83acb01-3a91-4950-848f-d447679c0533/volume-subpaths: no such file or directory Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.595536 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e83acb01-3a91-4950-848f-d447679c0533" (UID: "e83acb01-3a91-4950-848f-d447679c0533"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.636385 4697 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.636471 4697 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.636490 4697 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.636504 4697 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 20 
17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.636520 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxswp\" (UniqueName: \"kubernetes.io/projected/e83acb01-3a91-4950-848f-d447679c0533-kube-api-access-cxswp\") on node \"crc\" DevicePath \"\"" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.636534 4697 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e83acb01-3a91-4950-848f-d447679c0533-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.636548 4697 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.636562 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e83acb01-3a91-4950-848f-d447679c0533-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.658940 4697 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 20 17:59:07 crc kubenswrapper[4697]: I0220 17:59:07.740793 4697 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 20 17:59:08 crc kubenswrapper[4697]: I0220 17:59:08.037310 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e83acb01-3a91-4950-848f-d447679c0533","Type":"ContainerDied","Data":"53204102fa0927784a76063a0c24718be54c80f47e4dc4059aea2842199eca8d"} Feb 20 17:59:08 crc kubenswrapper[4697]: I0220 17:59:08.037359 4697 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="53204102fa0927784a76063a0c24718be54c80f47e4dc4059aea2842199eca8d" Feb 20 17:59:08 crc kubenswrapper[4697]: I0220 17:59:08.037358 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 20 17:59:08 crc kubenswrapper[4697]: I0220 17:59:08.150374 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-openstack-config-secret\") pod \"e83acb01-3a91-4950-848f-d447679c0533\" (UID: \"e83acb01-3a91-4950-848f-d447679c0533\") " Feb 20 17:59:08 crc kubenswrapper[4697]: I0220 17:59:08.157637 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e83acb01-3a91-4950-848f-d447679c0533" (UID: "e83acb01-3a91-4950-848f-d447679c0533"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 17:59:08 crc kubenswrapper[4697]: I0220 17:59:08.254746 4697 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e83acb01-3a91-4950-848f-d447679c0533-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 20 17:59:12 crc kubenswrapper[4697]: I0220 17:59:12.883693 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 17:59:14 crc kubenswrapper[4697]: I0220 17:59:14.095825 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"f70c8b9f6708d91f911ce6e63dfc4940e276016d2a4151ff12953e15e5419aa9"} Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.479757 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 20 17:59:19 crc kubenswrapper[4697]: E0220 17:59:19.480902 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41ea39e-c37c-44b9-8af5-344b32773c98" containerName="registry-server" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.480917 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41ea39e-c37c-44b9-8af5-344b32773c98" containerName="registry-server" Feb 20 17:59:19 crc kubenswrapper[4697]: E0220 17:59:19.480930 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41ea39e-c37c-44b9-8af5-344b32773c98" containerName="extract-utilities" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.480936 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41ea39e-c37c-44b9-8af5-344b32773c98" containerName="extract-utilities" Feb 20 17:59:19 crc kubenswrapper[4697]: E0220 17:59:19.480945 4697 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e41ea39e-c37c-44b9-8af5-344b32773c98" containerName="extract-content" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.480952 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41ea39e-c37c-44b9-8af5-344b32773c98" containerName="extract-content" Feb 20 17:59:19 crc kubenswrapper[4697]: E0220 17:59:19.480965 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83acb01-3a91-4950-848f-d447679c0533" containerName="tempest-tests-tempest-tests-runner" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.480971 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83acb01-3a91-4950-848f-d447679c0533" containerName="tempest-tests-tempest-tests-runner" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.481192 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83acb01-3a91-4950-848f-d447679c0533" containerName="tempest-tests-tempest-tests-runner" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.481206 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41ea39e-c37c-44b9-8af5-344b32773c98" containerName="registry-server" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.481933 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.492594 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n2vrd" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.501630 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.602707 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hcsk\" (UniqueName: \"kubernetes.io/projected/1b384505-09b4-453d-9418-b3116dfe429e-kube-api-access-8hcsk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1b384505-09b4-453d-9418-b3116dfe429e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.603214 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1b384505-09b4-453d-9418-b3116dfe429e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.705986 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hcsk\" (UniqueName: \"kubernetes.io/projected/1b384505-09b4-453d-9418-b3116dfe429e-kube-api-access-8hcsk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1b384505-09b4-453d-9418-b3116dfe429e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.706047 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1b384505-09b4-453d-9418-b3116dfe429e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.706594 4697 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1b384505-09b4-453d-9418-b3116dfe429e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.740841 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hcsk\" (UniqueName: \"kubernetes.io/projected/1b384505-09b4-453d-9418-b3116dfe429e-kube-api-access-8hcsk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1b384505-09b4-453d-9418-b3116dfe429e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.759323 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1b384505-09b4-453d-9418-b3116dfe429e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 17:59:19 crc kubenswrapper[4697]: I0220 17:59:19.833970 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 17:59:20 crc kubenswrapper[4697]: I0220 17:59:20.262026 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 20 17:59:21 crc kubenswrapper[4697]: I0220 17:59:21.174389 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1b384505-09b4-453d-9418-b3116dfe429e","Type":"ContainerStarted","Data":"c0cab1cab5e2d13a34c60c919d9a3c353315f2450bbdeeb514d0aaa6c2e8dad2"} Feb 20 17:59:22 crc kubenswrapper[4697]: I0220 17:59:22.187129 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1b384505-09b4-453d-9418-b3116dfe429e","Type":"ContainerStarted","Data":"dbd39cee8c0a58f248a4bc0ab539c6ac826353b7ae474da277f684a06bb8367f"} Feb 20 17:59:22 crc kubenswrapper[4697]: I0220 17:59:22.214250 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.159197454 podStartE2EDuration="3.214213881s" podCreationTimestamp="2026-02-20 17:59:19 +0000 UTC" firstStartedPulling="2026-02-20 17:59:20.268680619 +0000 UTC m=+5268.048726027" lastFinishedPulling="2026-02-20 17:59:21.323697006 +0000 UTC m=+5269.103742454" observedRunningTime="2026-02-20 17:59:22.200565648 +0000 UTC m=+5269.980611086" watchObservedRunningTime="2026-02-20 17:59:22.214213881 +0000 UTC m=+5269.994259319" Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.237992 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w8dxn/must-gather-xb7hq"] Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.240115 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8dxn/must-gather-xb7hq" Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.244822 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-w8dxn"/"openshift-service-ca.crt" Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.244870 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-w8dxn"/"default-dockercfg-b6c42" Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.244976 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-w8dxn"/"kube-root-ca.crt" Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.251683 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w8dxn/must-gather-xb7hq"] Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.368713 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/35308998-f1d9-472f-b649-2a7623cf6987-must-gather-output\") pod \"must-gather-xb7hq\" (UID: \"35308998-f1d9-472f-b649-2a7623cf6987\") " pod="openshift-must-gather-w8dxn/must-gather-xb7hq" Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.368814 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlpqs\" (UniqueName: \"kubernetes.io/projected/35308998-f1d9-472f-b649-2a7623cf6987-kube-api-access-zlpqs\") pod \"must-gather-xb7hq\" (UID: \"35308998-f1d9-472f-b649-2a7623cf6987\") " pod="openshift-must-gather-w8dxn/must-gather-xb7hq" Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.470956 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/35308998-f1d9-472f-b649-2a7623cf6987-must-gather-output\") pod \"must-gather-xb7hq\" (UID: \"35308998-f1d9-472f-b649-2a7623cf6987\") " 
pod="openshift-must-gather-w8dxn/must-gather-xb7hq" Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.471081 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlpqs\" (UniqueName: \"kubernetes.io/projected/35308998-f1d9-472f-b649-2a7623cf6987-kube-api-access-zlpqs\") pod \"must-gather-xb7hq\" (UID: \"35308998-f1d9-472f-b649-2a7623cf6987\") " pod="openshift-must-gather-w8dxn/must-gather-xb7hq" Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.471598 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/35308998-f1d9-472f-b649-2a7623cf6987-must-gather-output\") pod \"must-gather-xb7hq\" (UID: \"35308998-f1d9-472f-b649-2a7623cf6987\") " pod="openshift-must-gather-w8dxn/must-gather-xb7hq" Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.489574 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlpqs\" (UniqueName: \"kubernetes.io/projected/35308998-f1d9-472f-b649-2a7623cf6987-kube-api-access-zlpqs\") pod \"must-gather-xb7hq\" (UID: \"35308998-f1d9-472f-b649-2a7623cf6987\") " pod="openshift-must-gather-w8dxn/must-gather-xb7hq" Feb 20 17:59:49 crc kubenswrapper[4697]: I0220 17:59:49.559888 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8dxn/must-gather-xb7hq" Feb 20 17:59:50 crc kubenswrapper[4697]: I0220 17:59:50.012206 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w8dxn/must-gather-xb7hq"] Feb 20 17:59:50 crc kubenswrapper[4697]: I0220 17:59:50.496265 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/must-gather-xb7hq" event={"ID":"35308998-f1d9-472f-b649-2a7623cf6987","Type":"ContainerStarted","Data":"036f3378e1aace149aff2439516e8d5a550c006acc98368fc90ef66a2c827651"} Feb 20 17:59:57 crc kubenswrapper[4697]: I0220 17:59:57.582737 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/must-gather-xb7hq" event={"ID":"35308998-f1d9-472f-b649-2a7623cf6987","Type":"ContainerStarted","Data":"19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5"} Feb 20 17:59:58 crc kubenswrapper[4697]: I0220 17:59:58.591963 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/must-gather-xb7hq" event={"ID":"35308998-f1d9-472f-b649-2a7623cf6987","Type":"ContainerStarted","Data":"cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4"} Feb 20 17:59:58 crc kubenswrapper[4697]: I0220 17:59:58.612512 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w8dxn/must-gather-xb7hq" podStartSLOduration=2.369475109 podStartE2EDuration="9.612496345s" podCreationTimestamp="2026-02-20 17:59:49 +0000 UTC" firstStartedPulling="2026-02-20 17:59:50.022163231 +0000 UTC m=+5297.802208639" lastFinishedPulling="2026-02-20 17:59:57.265184457 +0000 UTC m=+5305.045229875" observedRunningTime="2026-02-20 17:59:58.609863981 +0000 UTC m=+5306.389909389" watchObservedRunningTime="2026-02-20 17:59:58.612496345 +0000 UTC m=+5306.392541753" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.160317 4697 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8"] Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.161997 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.165251 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.165498 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.190393 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8"] Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.204985 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-secret-volume\") pod \"collect-profiles-29526840-gxqc8\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.205242 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-config-volume\") pod \"collect-profiles-29526840-gxqc8\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.205470 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbxfp\" (UniqueName: 
\"kubernetes.io/projected/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-kube-api-access-cbxfp\") pod \"collect-profiles-29526840-gxqc8\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.306959 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbxfp\" (UniqueName: \"kubernetes.io/projected/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-kube-api-access-cbxfp\") pod \"collect-profiles-29526840-gxqc8\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.307046 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-secret-volume\") pod \"collect-profiles-29526840-gxqc8\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.307069 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-config-volume\") pod \"collect-profiles-29526840-gxqc8\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.308139 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-config-volume\") pod \"collect-profiles-29526840-gxqc8\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 
18:00:00.314124 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-secret-volume\") pod \"collect-profiles-29526840-gxqc8\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.323895 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbxfp\" (UniqueName: \"kubernetes.io/projected/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-kube-api-access-cbxfp\") pod \"collect-profiles-29526840-gxqc8\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.491422 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:00 crc kubenswrapper[4697]: I0220 18:00:00.799155 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8"] Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.622722 4697 generic.go:334] "Generic (PLEG): container finished" podID="eecf6221-4c8e-4998-aa1b-60b2fdf4c79b" containerID="c24300afcdf35362fd6fa63aa77eb00f958e43754ed462cac026392b5af058a3" exitCode=0 Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.622824 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" event={"ID":"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b","Type":"ContainerDied","Data":"c24300afcdf35362fd6fa63aa77eb00f958e43754ed462cac026392b5af058a3"} Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.623210 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" 
event={"ID":"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b","Type":"ContainerStarted","Data":"f8512148906140dca4fe42a10a3bbde222875382aa2e45d02398a68c35510a4d"} Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.677932 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w8dxn/crc-debug-jrdtv"] Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.679282 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.740918 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjn6h\" (UniqueName: \"kubernetes.io/projected/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-kube-api-access-tjn6h\") pod \"crc-debug-jrdtv\" (UID: \"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e\") " pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.740976 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-host\") pod \"crc-debug-jrdtv\" (UID: \"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e\") " pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.843562 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjn6h\" (UniqueName: \"kubernetes.io/projected/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-kube-api-access-tjn6h\") pod \"crc-debug-jrdtv\" (UID: \"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e\") " pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.843620 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-host\") pod \"crc-debug-jrdtv\" (UID: 
\"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e\") " pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.843849 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-host\") pod \"crc-debug-jrdtv\" (UID: \"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e\") " pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.870144 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjn6h\" (UniqueName: \"kubernetes.io/projected/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-kube-api-access-tjn6h\") pod \"crc-debug-jrdtv\" (UID: \"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e\") " pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" Feb 20 18:00:01 crc kubenswrapper[4697]: I0220 18:00:01.999013 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" Feb 20 18:00:02 crc kubenswrapper[4697]: W0220 18:00:02.383806 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01bd3db3_68ef_4e2d_9bc7_5c41fbe1213e.slice/crio-d5d4b20c2091cee6e4873b230fc7d2112940a261de9ad16c57a9c9319f83485e WatchSource:0}: Error finding container d5d4b20c2091cee6e4873b230fc7d2112940a261de9ad16c57a9c9319f83485e: Status 404 returned error can't find the container with id d5d4b20c2091cee6e4873b230fc7d2112940a261de9ad16c57a9c9319f83485e Feb 20 18:00:02 crc kubenswrapper[4697]: I0220 18:00:02.639876 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" event={"ID":"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e","Type":"ContainerStarted","Data":"d5d4b20c2091cee6e4873b230fc7d2112940a261de9ad16c57a9c9319f83485e"} Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.005232 4697 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.064957 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-secret-volume\") pod \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.065048 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbxfp\" (UniqueName: \"kubernetes.io/projected/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-kube-api-access-cbxfp\") pod \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.065078 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-config-volume\") pod \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\" (UID: \"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b\") " Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.067194 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-config-volume" (OuterVolumeSpecName: "config-volume") pod "eecf6221-4c8e-4998-aa1b-60b2fdf4c79b" (UID: "eecf6221-4c8e-4998-aa1b-60b2fdf4c79b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.075558 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eecf6221-4c8e-4998-aa1b-60b2fdf4c79b" (UID: "eecf6221-4c8e-4998-aa1b-60b2fdf4c79b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.078929 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-kube-api-access-cbxfp" (OuterVolumeSpecName: "kube-api-access-cbxfp") pod "eecf6221-4c8e-4998-aa1b-60b2fdf4c79b" (UID: "eecf6221-4c8e-4998-aa1b-60b2fdf4c79b"). InnerVolumeSpecName "kube-api-access-cbxfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.169918 4697 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.169970 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbxfp\" (UniqueName: \"kubernetes.io/projected/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-kube-api-access-cbxfp\") on node \"crc\" DevicePath \"\"" Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.169983 4697 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eecf6221-4c8e-4998-aa1b-60b2fdf4c79b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.650422 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" event={"ID":"eecf6221-4c8e-4998-aa1b-60b2fdf4c79b","Type":"ContainerDied","Data":"f8512148906140dca4fe42a10a3bbde222875382aa2e45d02398a68c35510a4d"} Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.651856 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8512148906140dca4fe42a10a3bbde222875382aa2e45d02398a68c35510a4d" Feb 20 18:00:03 crc kubenswrapper[4697]: I0220 18:00:03.650495 4697 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526840-gxqc8" Feb 20 18:00:04 crc kubenswrapper[4697]: I0220 18:00:04.092989 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv"] Feb 20 18:00:04 crc kubenswrapper[4697]: I0220 18:00:04.102292 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526795-ttvtv"] Feb 20 18:00:04 crc kubenswrapper[4697]: I0220 18:00:04.891202 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4f83b20-b4f6-4b44-87f5-12ace92d12fb" path="/var/lib/kubelet/pods/e4f83b20-b4f6-4b44-87f5-12ace92d12fb/volumes" Feb 20 18:00:14 crc kubenswrapper[4697]: I0220 18:00:14.758175 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" event={"ID":"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e","Type":"ContainerStarted","Data":"c73c908e853a241ccffe20069b4c81d03239e01142e5330a4b874e73484155bb"} Feb 20 18:00:14 crc kubenswrapper[4697]: I0220 18:00:14.789999 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" podStartSLOduration=2.693424096 podStartE2EDuration="13.789976969s" podCreationTimestamp="2026-02-20 18:00:01 +0000 UTC" firstStartedPulling="2026-02-20 18:00:02.38595816 +0000 UTC m=+5310.166003578" lastFinishedPulling="2026-02-20 18:00:13.482511043 +0000 UTC m=+5321.262556451" observedRunningTime="2026-02-20 18:00:14.773416965 +0000 UTC m=+5322.553462373" watchObservedRunningTime="2026-02-20 18:00:14.789976969 +0000 UTC m=+5322.570022387" Feb 20 18:00:47 crc kubenswrapper[4697]: I0220 18:00:47.684555 4697 scope.go:117] "RemoveContainer" containerID="f5e3011ef626e46ac458ed312e9ffb5eefeffe426ec276f3f86f13e9eb0a3a72" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.149706 4697 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-cron-29526841-57prn"] Feb 20 18:01:00 crc kubenswrapper[4697]: E0220 18:01:00.150698 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eecf6221-4c8e-4998-aa1b-60b2fdf4c79b" containerName="collect-profiles" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.150713 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="eecf6221-4c8e-4998-aa1b-60b2fdf4c79b" containerName="collect-profiles" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.150944 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="eecf6221-4c8e-4998-aa1b-60b2fdf4c79b" containerName="collect-profiles" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.151700 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.160033 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526841-57prn"] Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.264750 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-config-data\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.264870 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5b49\" (UniqueName: \"kubernetes.io/projected/594dde87-85f5-42d7-affc-466eb2311afc-kube-api-access-q5b49\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.264937 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-combined-ca-bundle\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.265080 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-fernet-keys\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.367365 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-config-data\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.367452 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5b49\" (UniqueName: \"kubernetes.io/projected/594dde87-85f5-42d7-affc-466eb2311afc-kube-api-access-q5b49\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.367498 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-combined-ca-bundle\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.367537 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-fernet-keys\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.383199 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-combined-ca-bundle\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.383463 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-fernet-keys\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.383986 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-config-data\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.392285 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5b49\" (UniqueName: \"kubernetes.io/projected/594dde87-85f5-42d7-affc-466eb2311afc-kube-api-access-q5b49\") pod \"keystone-cron-29526841-57prn\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:00 crc kubenswrapper[4697]: I0220 18:01:00.484037 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:01 crc kubenswrapper[4697]: I0220 18:01:01.035680 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526841-57prn"] Feb 20 18:01:01 crc kubenswrapper[4697]: I0220 18:01:01.193703 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526841-57prn" event={"ID":"594dde87-85f5-42d7-affc-466eb2311afc","Type":"ContainerStarted","Data":"70738eece1a673fc71af59e44c49188137cd5b083c81db1061a1dd8b18932e03"} Feb 20 18:01:01 crc kubenswrapper[4697]: I0220 18:01:01.194902 4697 generic.go:334] "Generic (PLEG): container finished" podID="01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e" containerID="c73c908e853a241ccffe20069b4c81d03239e01142e5330a4b874e73484155bb" exitCode=0 Feb 20 18:01:01 crc kubenswrapper[4697]: I0220 18:01:01.194951 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" event={"ID":"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e","Type":"ContainerDied","Data":"c73c908e853a241ccffe20069b4c81d03239e01142e5330a4b874e73484155bb"} Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.225764 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526841-57prn" event={"ID":"594dde87-85f5-42d7-affc-466eb2311afc","Type":"ContainerStarted","Data":"27d83589a9d41c81c0c37ddbef14dc98a6f3368ac5b462f5aac3c33254bbb965"} Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.259565 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29526841-57prn" podStartSLOduration=2.259539611 podStartE2EDuration="2.259539611s" podCreationTimestamp="2026-02-20 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 18:01:02.249029195 +0000 UTC m=+5370.029074613" watchObservedRunningTime="2026-02-20 18:01:02.259539611 +0000 UTC 
m=+5370.039585029" Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.330481 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.375015 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w8dxn/crc-debug-jrdtv"] Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.385075 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w8dxn/crc-debug-jrdtv"] Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.407233 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjn6h\" (UniqueName: \"kubernetes.io/projected/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-kube-api-access-tjn6h\") pod \"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e\" (UID: \"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e\") " Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.407363 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-host\") pod \"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e\" (UID: \"01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e\") " Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.407739 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-host" (OuterVolumeSpecName: "host") pod "01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e" (UID: "01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.423220 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-kube-api-access-tjn6h" (OuterVolumeSpecName: "kube-api-access-tjn6h") pod "01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e" (UID: "01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e"). InnerVolumeSpecName "kube-api-access-tjn6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.510215 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjn6h\" (UniqueName: \"kubernetes.io/projected/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-kube-api-access-tjn6h\") on node \"crc\" DevicePath \"\"" Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.510258 4697 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e-host\") on node \"crc\" DevicePath \"\"" Feb 20 18:01:02 crc kubenswrapper[4697]: I0220 18:01:02.890466 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e" path="/var/lib/kubelet/pods/01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e/volumes" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.235894 4697 scope.go:117] "RemoveContainer" containerID="c73c908e853a241ccffe20069b4c81d03239e01142e5330a4b874e73484155bb" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.235925 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-jrdtv" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.658513 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w8dxn/crc-debug-bllcs"] Feb 20 18:01:03 crc kubenswrapper[4697]: E0220 18:01:03.658901 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e" containerName="container-00" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.658916 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e" containerName="container-00" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.659112 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bd3db3-68ef-4e2d-9bc7-5c41fbe1213e" containerName="container-00" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.659853 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-bllcs" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.734880 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da55b971-ee64-4312-9812-58a095a75bb4-host\") pod \"crc-debug-bllcs\" (UID: \"da55b971-ee64-4312-9812-58a095a75bb4\") " pod="openshift-must-gather-w8dxn/crc-debug-bllcs" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.735056 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bhsl\" (UniqueName: \"kubernetes.io/projected/da55b971-ee64-4312-9812-58a095a75bb4-kube-api-access-7bhsl\") pod \"crc-debug-bllcs\" (UID: \"da55b971-ee64-4312-9812-58a095a75bb4\") " pod="openshift-must-gather-w8dxn/crc-debug-bllcs" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.837239 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/da55b971-ee64-4312-9812-58a095a75bb4-host\") pod \"crc-debug-bllcs\" (UID: \"da55b971-ee64-4312-9812-58a095a75bb4\") " pod="openshift-must-gather-w8dxn/crc-debug-bllcs" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.837328 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bhsl\" (UniqueName: \"kubernetes.io/projected/da55b971-ee64-4312-9812-58a095a75bb4-kube-api-access-7bhsl\") pod \"crc-debug-bllcs\" (UID: \"da55b971-ee64-4312-9812-58a095a75bb4\") " pod="openshift-must-gather-w8dxn/crc-debug-bllcs" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.837361 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da55b971-ee64-4312-9812-58a095a75bb4-host\") pod \"crc-debug-bllcs\" (UID: \"da55b971-ee64-4312-9812-58a095a75bb4\") " pod="openshift-must-gather-w8dxn/crc-debug-bllcs" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.859157 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bhsl\" (UniqueName: \"kubernetes.io/projected/da55b971-ee64-4312-9812-58a095a75bb4-kube-api-access-7bhsl\") pod \"crc-debug-bllcs\" (UID: \"da55b971-ee64-4312-9812-58a095a75bb4\") " pod="openshift-must-gather-w8dxn/crc-debug-bllcs" Feb 20 18:01:03 crc kubenswrapper[4697]: I0220 18:01:03.981843 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-bllcs" Feb 20 18:01:04 crc kubenswrapper[4697]: W0220 18:01:04.012765 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda55b971_ee64_4312_9812_58a095a75bb4.slice/crio-143860da648c0704cbf7cabc6f57de3f6e37c4f281fc7a5f7e87f046e15d032f WatchSource:0}: Error finding container 143860da648c0704cbf7cabc6f57de3f6e37c4f281fc7a5f7e87f046e15d032f: Status 404 returned error can't find the container with id 143860da648c0704cbf7cabc6f57de3f6e37c4f281fc7a5f7e87f046e15d032f Feb 20 18:01:04 crc kubenswrapper[4697]: I0220 18:01:04.246364 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/crc-debug-bllcs" event={"ID":"da55b971-ee64-4312-9812-58a095a75bb4","Type":"ContainerStarted","Data":"f393be63eee21acf54654a87ae6cb1b064e2a80a4cc7a6789953ad8827a53345"} Feb 20 18:01:04 crc kubenswrapper[4697]: I0220 18:01:04.246414 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/crc-debug-bllcs" event={"ID":"da55b971-ee64-4312-9812-58a095a75bb4","Type":"ContainerStarted","Data":"143860da648c0704cbf7cabc6f57de3f6e37c4f281fc7a5f7e87f046e15d032f"} Feb 20 18:01:04 crc kubenswrapper[4697]: I0220 18:01:04.271422 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w8dxn/crc-debug-bllcs" podStartSLOduration=1.271404402 podStartE2EDuration="1.271404402s" podCreationTimestamp="2026-02-20 18:01:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 18:01:04.260093236 +0000 UTC m=+5372.040138644" watchObservedRunningTime="2026-02-20 18:01:04.271404402 +0000 UTC m=+5372.051449810" Feb 20 18:01:05 crc kubenswrapper[4697]: I0220 18:01:05.257199 4697 generic.go:334] "Generic (PLEG): container finished" podID="594dde87-85f5-42d7-affc-466eb2311afc" 
containerID="27d83589a9d41c81c0c37ddbef14dc98a6f3368ac5b462f5aac3c33254bbb965" exitCode=0 Feb 20 18:01:05 crc kubenswrapper[4697]: I0220 18:01:05.257460 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526841-57prn" event={"ID":"594dde87-85f5-42d7-affc-466eb2311afc","Type":"ContainerDied","Data":"27d83589a9d41c81c0c37ddbef14dc98a6f3368ac5b462f5aac3c33254bbb965"} Feb 20 18:01:05 crc kubenswrapper[4697]: I0220 18:01:05.258865 4697 generic.go:334] "Generic (PLEG): container finished" podID="da55b971-ee64-4312-9812-58a095a75bb4" containerID="f393be63eee21acf54654a87ae6cb1b064e2a80a4cc7a6789953ad8827a53345" exitCode=0 Feb 20 18:01:05 crc kubenswrapper[4697]: I0220 18:01:05.258905 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/crc-debug-bllcs" event={"ID":"da55b971-ee64-4312-9812-58a095a75bb4","Type":"ContainerDied","Data":"f393be63eee21acf54654a87ae6cb1b064e2a80a4cc7a6789953ad8827a53345"} Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.395272 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-bllcs" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.493243 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da55b971-ee64-4312-9812-58a095a75bb4-host\") pod \"da55b971-ee64-4312-9812-58a095a75bb4\" (UID: \"da55b971-ee64-4312-9812-58a095a75bb4\") " Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.493359 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bhsl\" (UniqueName: \"kubernetes.io/projected/da55b971-ee64-4312-9812-58a095a75bb4-kube-api-access-7bhsl\") pod \"da55b971-ee64-4312-9812-58a095a75bb4\" (UID: \"da55b971-ee64-4312-9812-58a095a75bb4\") " Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.493493 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da55b971-ee64-4312-9812-58a095a75bb4-host" (OuterVolumeSpecName: "host") pod "da55b971-ee64-4312-9812-58a095a75bb4" (UID: "da55b971-ee64-4312-9812-58a095a75bb4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.493740 4697 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da55b971-ee64-4312-9812-58a095a75bb4-host\") on node \"crc\" DevicePath \"\"" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.500626 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da55b971-ee64-4312-9812-58a095a75bb4-kube-api-access-7bhsl" (OuterVolumeSpecName: "kube-api-access-7bhsl") pod "da55b971-ee64-4312-9812-58a095a75bb4" (UID: "da55b971-ee64-4312-9812-58a095a75bb4"). InnerVolumeSpecName "kube-api-access-7bhsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.520171 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w8dxn/crc-debug-bllcs"] Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.529083 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w8dxn/crc-debug-bllcs"] Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.591462 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.595813 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bhsl\" (UniqueName: \"kubernetes.io/projected/da55b971-ee64-4312-9812-58a095a75bb4-kube-api-access-7bhsl\") on node \"crc\" DevicePath \"\"" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.697603 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5b49\" (UniqueName: \"kubernetes.io/projected/594dde87-85f5-42d7-affc-466eb2311afc-kube-api-access-q5b49\") pod \"594dde87-85f5-42d7-affc-466eb2311afc\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.697966 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-fernet-keys\") pod \"594dde87-85f5-42d7-affc-466eb2311afc\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.697999 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-config-data\") pod \"594dde87-85f5-42d7-affc-466eb2311afc\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.698048 4697 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-combined-ca-bundle\") pod \"594dde87-85f5-42d7-affc-466eb2311afc\" (UID: \"594dde87-85f5-42d7-affc-466eb2311afc\") " Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.705667 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594dde87-85f5-42d7-affc-466eb2311afc-kube-api-access-q5b49" (OuterVolumeSpecName: "kube-api-access-q5b49") pod "594dde87-85f5-42d7-affc-466eb2311afc" (UID: "594dde87-85f5-42d7-affc-466eb2311afc"). InnerVolumeSpecName "kube-api-access-q5b49". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.712640 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "594dde87-85f5-42d7-affc-466eb2311afc" (UID: "594dde87-85f5-42d7-affc-466eb2311afc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.734577 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "594dde87-85f5-42d7-affc-466eb2311afc" (UID: "594dde87-85f5-42d7-affc-466eb2311afc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.772405 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-config-data" (OuterVolumeSpecName: "config-data") pod "594dde87-85f5-42d7-affc-466eb2311afc" (UID: "594dde87-85f5-42d7-affc-466eb2311afc"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.799820 4697 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.799848 4697 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.799858 4697 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594dde87-85f5-42d7-affc-466eb2311afc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.799868 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5b49\" (UniqueName: \"kubernetes.io/projected/594dde87-85f5-42d7-affc-466eb2311afc-kube-api-access-q5b49\") on node \"crc\" DevicePath \"\"" Feb 20 18:01:06 crc kubenswrapper[4697]: I0220 18:01:06.889902 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da55b971-ee64-4312-9812-58a095a75bb4" path="/var/lib/kubelet/pods/da55b971-ee64-4312-9812-58a095a75bb4/volumes" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.329903 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526841-57prn" event={"ID":"594dde87-85f5-42d7-affc-466eb2311afc","Type":"ContainerDied","Data":"70738eece1a673fc71af59e44c49188137cd5b083c81db1061a1dd8b18932e03"} Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.329964 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70738eece1a673fc71af59e44c49188137cd5b083c81db1061a1dd8b18932e03" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 
18:01:07.329929 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526841-57prn" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.334152 4697 scope.go:117] "RemoveContainer" containerID="f393be63eee21acf54654a87ae6cb1b064e2a80a4cc7a6789953ad8827a53345" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.334329 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-bllcs" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.663498 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w8dxn/crc-debug-z8lbh"] Feb 20 18:01:07 crc kubenswrapper[4697]: E0220 18:01:07.663909 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da55b971-ee64-4312-9812-58a095a75bb4" containerName="container-00" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.663921 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="da55b971-ee64-4312-9812-58a095a75bb4" containerName="container-00" Feb 20 18:01:07 crc kubenswrapper[4697]: E0220 18:01:07.663941 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594dde87-85f5-42d7-affc-466eb2311afc" containerName="keystone-cron" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.663947 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="594dde87-85f5-42d7-affc-466eb2311afc" containerName="keystone-cron" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.664117 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="594dde87-85f5-42d7-affc-466eb2311afc" containerName="keystone-cron" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.664129 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="da55b971-ee64-4312-9812-58a095a75bb4" containerName="container-00" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.664767 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.723401 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n78j4\" (UniqueName: \"kubernetes.io/projected/3015a78a-8332-427e-8840-754a04e38f6d-kube-api-access-n78j4\") pod \"crc-debug-z8lbh\" (UID: \"3015a78a-8332-427e-8840-754a04e38f6d\") " pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.723519 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3015a78a-8332-427e-8840-754a04e38f6d-host\") pod \"crc-debug-z8lbh\" (UID: \"3015a78a-8332-427e-8840-754a04e38f6d\") " pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.830203 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n78j4\" (UniqueName: \"kubernetes.io/projected/3015a78a-8332-427e-8840-754a04e38f6d-kube-api-access-n78j4\") pod \"crc-debug-z8lbh\" (UID: \"3015a78a-8332-427e-8840-754a04e38f6d\") " pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.830919 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3015a78a-8332-427e-8840-754a04e38f6d-host\") pod \"crc-debug-z8lbh\" (UID: \"3015a78a-8332-427e-8840-754a04e38f6d\") " pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.831300 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3015a78a-8332-427e-8840-754a04e38f6d-host\") pod \"crc-debug-z8lbh\" (UID: \"3015a78a-8332-427e-8840-754a04e38f6d\") " pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" Feb 20 18:01:07 crc 
kubenswrapper[4697]: I0220 18:01:07.862379 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n78j4\" (UniqueName: \"kubernetes.io/projected/3015a78a-8332-427e-8840-754a04e38f6d-kube-api-access-n78j4\") pod \"crc-debug-z8lbh\" (UID: \"3015a78a-8332-427e-8840-754a04e38f6d\") " pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" Feb 20 18:01:07 crc kubenswrapper[4697]: I0220 18:01:07.979359 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" Feb 20 18:01:08 crc kubenswrapper[4697]: I0220 18:01:08.348883 4697 generic.go:334] "Generic (PLEG): container finished" podID="3015a78a-8332-427e-8840-754a04e38f6d" containerID="e15aa9a416adf3432b740f3e987b44ae71ebd2fe20a33a177e86082ce29bcff6" exitCode=0 Feb 20 18:01:08 crc kubenswrapper[4697]: I0220 18:01:08.349054 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" event={"ID":"3015a78a-8332-427e-8840-754a04e38f6d","Type":"ContainerDied","Data":"e15aa9a416adf3432b740f3e987b44ae71ebd2fe20a33a177e86082ce29bcff6"} Feb 20 18:01:08 crc kubenswrapper[4697]: I0220 18:01:08.349259 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" event={"ID":"3015a78a-8332-427e-8840-754a04e38f6d","Type":"ContainerStarted","Data":"6c039d36f1a124112efd64ec37b9a7cb57fc15c682a504a6cc3d47a7192244ee"} Feb 20 18:01:08 crc kubenswrapper[4697]: I0220 18:01:08.389791 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w8dxn/crc-debug-z8lbh"] Feb 20 18:01:08 crc kubenswrapper[4697]: I0220 18:01:08.397412 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w8dxn/crc-debug-z8lbh"] Feb 20 18:01:09 crc kubenswrapper[4697]: I0220 18:01:09.689729 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" Feb 20 18:01:09 crc kubenswrapper[4697]: I0220 18:01:09.868556 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3015a78a-8332-427e-8840-754a04e38f6d-host\") pod \"3015a78a-8332-427e-8840-754a04e38f6d\" (UID: \"3015a78a-8332-427e-8840-754a04e38f6d\") " Feb 20 18:01:09 crc kubenswrapper[4697]: I0220 18:01:09.869050 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n78j4\" (UniqueName: \"kubernetes.io/projected/3015a78a-8332-427e-8840-754a04e38f6d-kube-api-access-n78j4\") pod \"3015a78a-8332-427e-8840-754a04e38f6d\" (UID: \"3015a78a-8332-427e-8840-754a04e38f6d\") " Feb 20 18:01:09 crc kubenswrapper[4697]: I0220 18:01:09.869222 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3015a78a-8332-427e-8840-754a04e38f6d-host" (OuterVolumeSpecName: "host") pod "3015a78a-8332-427e-8840-754a04e38f6d" (UID: "3015a78a-8332-427e-8840-754a04e38f6d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 18:01:09 crc kubenswrapper[4697]: I0220 18:01:09.869392 4697 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3015a78a-8332-427e-8840-754a04e38f6d-host\") on node \"crc\" DevicePath \"\"" Feb 20 18:01:09 crc kubenswrapper[4697]: I0220 18:01:09.874794 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3015a78a-8332-427e-8840-754a04e38f6d-kube-api-access-n78j4" (OuterVolumeSpecName: "kube-api-access-n78j4") pod "3015a78a-8332-427e-8840-754a04e38f6d" (UID: "3015a78a-8332-427e-8840-754a04e38f6d"). InnerVolumeSpecName "kube-api-access-n78j4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:01:09 crc kubenswrapper[4697]: I0220 18:01:09.971793 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n78j4\" (UniqueName: \"kubernetes.io/projected/3015a78a-8332-427e-8840-754a04e38f6d-kube-api-access-n78j4\") on node \"crc\" DevicePath \"\"" Feb 20 18:01:10 crc kubenswrapper[4697]: I0220 18:01:10.367120 4697 scope.go:117] "RemoveContainer" containerID="e15aa9a416adf3432b740f3e987b44ae71ebd2fe20a33a177e86082ce29bcff6" Feb 20 18:01:10 crc kubenswrapper[4697]: I0220 18:01:10.367236 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w8dxn/crc-debug-z8lbh" Feb 20 18:01:10 crc kubenswrapper[4697]: I0220 18:01:10.891198 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3015a78a-8332-427e-8840-754a04e38f6d" path="/var/lib/kubelet/pods/3015a78a-8332-427e-8840-754a04e38f6d/volumes" Feb 20 18:01:31 crc kubenswrapper[4697]: I0220 18:01:31.185078 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 18:01:31 crc kubenswrapper[4697]: I0220 18:01:31.185787 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 18:01:45 crc kubenswrapper[4697]: I0220 18:01:45.645602 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fddb45f9b-25kb9_5fcf8d33-fdb6-43e6-aade-d7dc55b3848c/barbican-api/0.log" Feb 20 18:01:45 crc kubenswrapper[4697]: I0220 18:01:45.759389 
4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fddb45f9b-25kb9_5fcf8d33-fdb6-43e6-aade-d7dc55b3848c/barbican-api-log/0.log" Feb 20 18:01:45 crc kubenswrapper[4697]: I0220 18:01:45.822346 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-674dd9ffc6-spdfl_71a69046-b0f7-4c26-a941-aba4a9475d0a/barbican-keystone-listener/0.log" Feb 20 18:01:45 crc kubenswrapper[4697]: I0220 18:01:45.909208 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-674dd9ffc6-spdfl_71a69046-b0f7-4c26-a941-aba4a9475d0a/barbican-keystone-listener-log/0.log" Feb 20 18:01:46 crc kubenswrapper[4697]: I0220 18:01:46.438179 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8474b565df-7h82r_9dc44740-061d-4c3a-9164-735d6da2dcf7/barbican-worker/0.log" Feb 20 18:01:46 crc kubenswrapper[4697]: I0220 18:01:46.497757 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8474b565df-7h82r_9dc44740-061d-4c3a-9164-735d6da2dcf7/barbican-worker-log/0.log" Feb 20 18:01:46 crc kubenswrapper[4697]: I0220 18:01:46.521726 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj_3343ad5b-476c-4e27-a5f7-e7948d8eed62/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:46 crc kubenswrapper[4697]: I0220 18:01:46.763854 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a3336f2b-7449-4ad5-9696-7565e147beab/ceilometer-notification-agent/0.log" Feb 20 18:01:46 crc kubenswrapper[4697]: I0220 18:01:46.792060 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a3336f2b-7449-4ad5-9696-7565e147beab/proxy-httpd/0.log" Feb 20 18:01:46 crc kubenswrapper[4697]: I0220 18:01:46.827531 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_a3336f2b-7449-4ad5-9696-7565e147beab/ceilometer-central-agent/0.log" Feb 20 18:01:46 crc kubenswrapper[4697]: I0220 18:01:46.898711 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a3336f2b-7449-4ad5-9696-7565e147beab/sg-core/0.log" Feb 20 18:01:47 crc kubenswrapper[4697]: I0220 18:01:47.034696 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e8dd164d-3a39-4c58-99a0-1766204765bf/cinder-api-log/0.log" Feb 20 18:01:47 crc kubenswrapper[4697]: I0220 18:01:47.311737 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e9192fe2-573e-4d32-915c-535887423540/probe/0.log" Feb 20 18:01:47 crc kubenswrapper[4697]: I0220 18:01:47.523705 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e8dd164d-3a39-4c58-99a0-1766204765bf/cinder-api/0.log" Feb 20 18:01:47 crc kubenswrapper[4697]: I0220 18:01:47.583955 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8a927bbf-7945-44bd-9c0c-1f24b0af5b9a/cinder-scheduler/0.log" Feb 20 18:01:47 crc kubenswrapper[4697]: I0220 18:01:47.590694 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e9192fe2-573e-4d32-915c-535887423540/cinder-backup/0.log" Feb 20 18:01:47 crc kubenswrapper[4697]: I0220 18:01:47.730658 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8a927bbf-7945-44bd-9c0c-1f24b0af5b9a/probe/0.log" Feb 20 18:01:47 crc kubenswrapper[4697]: I0220 18:01:47.855241 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_0ca8173d-4029-4a57-80c9-c63c05842bb5/probe/0.log" Feb 20 18:01:47 crc kubenswrapper[4697]: I0220 18:01:47.954300 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_0ca8173d-4029-4a57-80c9-c63c05842bb5/cinder-volume/0.log" Feb 20 
18:01:48 crc kubenswrapper[4697]: I0220 18:01:48.163579 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_59ea396a-fa78-4a1a-95e5-48f3e4e49bda/probe/0.log" Feb 20 18:01:48 crc kubenswrapper[4697]: I0220 18:01:48.181232 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_59ea396a-fa78-4a1a-95e5-48f3e4e49bda/cinder-volume/0.log" Feb 20 18:01:48 crc kubenswrapper[4697]: I0220 18:01:48.241036 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt_a2756b57-da81-4893-85d4-119fe103b4de/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:48 crc kubenswrapper[4697]: I0220 18:01:48.439845 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f_904a709b-1b1b-46d9-b2cd-d6517ff7ef07/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:48 crc kubenswrapper[4697]: I0220 18:01:48.469062 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5955df7555-qdwfx_af760d37-5cb2-4378-811d-cf343b5c9faf/init/0.log" Feb 20 18:01:48 crc kubenswrapper[4697]: I0220 18:01:48.680144 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5955df7555-qdwfx_af760d37-5cb2-4378-811d-cf343b5c9faf/init/0.log" Feb 20 18:01:48 crc kubenswrapper[4697]: I0220 18:01:48.693901 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6_671fae5d-08e8-4fac-ba16-e33a5a4f1f0b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:48 crc kubenswrapper[4697]: I0220 18:01:48.771609 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5955df7555-qdwfx_af760d37-5cb2-4378-811d-cf343b5c9faf/dnsmasq-dns/0.log" Feb 20 18:01:48 crc kubenswrapper[4697]: I0220 
18:01:48.901111 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0/glance-httpd/0.log" Feb 20 18:01:48 crc kubenswrapper[4697]: I0220 18:01:48.905890 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0/glance-log/0.log" Feb 20 18:01:48 crc kubenswrapper[4697]: I0220 18:01:48.964451 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2eec23bf-874f-423a-8d8c-b3f20b494c87/glance-httpd/0.log" Feb 20 18:01:48 crc kubenswrapper[4697]: I0220 18:01:48.993645 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2eec23bf-874f-423a-8d8c-b3f20b494c87/glance-log/0.log" Feb 20 18:01:49 crc kubenswrapper[4697]: I0220 18:01:49.146251 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bc54df884-mx794_215f7a56-10a1-4ae5-9071-4983dbb45b35/horizon/0.log" Feb 20 18:01:49 crc kubenswrapper[4697]: I0220 18:01:49.211767 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq_04c7e6bd-f464-42ea-aa0b-a4b47a169d6f/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:49 crc kubenswrapper[4697]: I0220 18:01:49.384583 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-6mlzs_0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:49 crc kubenswrapper[4697]: I0220 18:01:49.690562 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29526781-56g9l_fb8e173e-11e6-4bc4-a87e-58fd25b53076/keystone-cron/0.log" Feb 20 18:01:49 crc kubenswrapper[4697]: I0220 18:01:49.812478 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29526841-57prn_594dde87-85f5-42d7-affc-466eb2311afc/keystone-cron/0.log" Feb 20 18:01:49 crc kubenswrapper[4697]: I0220 18:01:49.812722 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bc54df884-mx794_215f7a56-10a1-4ae5-9071-4983dbb45b35/horizon-log/0.log" Feb 20 18:01:49 crc kubenswrapper[4697]: I0220 18:01:49.971550 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_77b717c8-8b8d-4236-bd5e-95fb768f1f89/kube-state-metrics/0.log" Feb 20 18:01:49 crc kubenswrapper[4697]: I0220 18:01:49.985483 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85d8b4ccc6-tdklm_9e1ff974-0f05-484c-b267-f1537ec9495e/keystone-api/0.log" Feb 20 18:01:50 crc kubenswrapper[4697]: I0220 18:01:50.096663 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4_e6c6663b-45c5-4629-98fb-23de62292ee1/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:50 crc kubenswrapper[4697]: I0220 18:01:50.519865 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76b77c89fc-t9rjg_0e0dd99f-4186-47a0-b6ba-c8d4abd040b0/neutron-httpd/0.log" Feb 20 18:01:50 crc kubenswrapper[4697]: I0220 18:01:50.525972 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf_8190a9c0-1f92-4f97-8d67-04668a6920a2/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:50 crc kubenswrapper[4697]: I0220 18:01:50.592165 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76b77c89fc-t9rjg_0e0dd99f-4186-47a0-b6ba-c8d4abd040b0/neutron-api/0.log" Feb 20 18:01:50 crc kubenswrapper[4697]: I0220 18:01:50.678227 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_notifications-rabbitmq-server-0_591f7e7d-78bf-43a5-afe2-119f93765311/setup-container/0.log" Feb 20 18:01:50 crc kubenswrapper[4697]: I0220 18:01:50.944157 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_591f7e7d-78bf-43a5-afe2-119f93765311/rabbitmq/0.log" Feb 20 18:01:50 crc kubenswrapper[4697]: I0220 18:01:50.951882 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_591f7e7d-78bf-43a5-afe2-119f93765311/setup-container/0.log" Feb 20 18:01:51 crc kubenswrapper[4697]: I0220 18:01:51.418492 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2a29103e-4075-486e-8107-34b4a75352cc/nova-cell0-conductor-conductor/0.log" Feb 20 18:01:51 crc kubenswrapper[4697]: I0220 18:01:51.747369 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a52da959-5d9a-4a66-8907-400b5bd0acfa/nova-cell1-conductor-conductor/0.log" Feb 20 18:01:52 crc kubenswrapper[4697]: I0220 18:01:52.004044 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_4b1ddcd6-abc4-467c-8cd1-1937c803e0b4/nova-cell1-novncproxy-novncproxy/0.log" Feb 20 18:01:52 crc kubenswrapper[4697]: I0220 18:01:52.232037 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-wzxdf_62242a65-ea27-495f-aa04-4a274f9e771a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:52 crc kubenswrapper[4697]: I0220 18:01:52.279258 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_947447ab-bffc-4330-9983-38789a4e8fcc/nova-api-log/0.log" Feb 20 18:01:52 crc kubenswrapper[4697]: I0220 18:01:52.569233 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5f42a083-98ed-403d-90ca-cc4ef5ba79d1/nova-metadata-log/0.log" Feb 20 18:01:52 crc 
kubenswrapper[4697]: I0220 18:01:52.585904 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_947447ab-bffc-4330-9983-38789a4e8fcc/nova-api-api/0.log" Feb 20 18:01:52 crc kubenswrapper[4697]: I0220 18:01:52.796910 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b33962a9-1867-4e1c-b597-d426ecf83e50/mysql-bootstrap/0.log" Feb 20 18:01:52 crc kubenswrapper[4697]: I0220 18:01:52.974679 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9bcd6e42-c512-4076-926f-b64c1da63a8a/nova-scheduler-scheduler/0.log" Feb 20 18:01:53 crc kubenswrapper[4697]: I0220 18:01:53.096953 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b33962a9-1867-4e1c-b597-d426ecf83e50/galera/0.log" Feb 20 18:01:53 crc kubenswrapper[4697]: I0220 18:01:53.120556 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b33962a9-1867-4e1c-b597-d426ecf83e50/mysql-bootstrap/0.log" Feb 20 18:01:53 crc kubenswrapper[4697]: I0220 18:01:53.315207 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33594c24-be5d-42de-ba91-5584becb21e3/mysql-bootstrap/0.log" Feb 20 18:01:53 crc kubenswrapper[4697]: I0220 18:01:53.473035 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33594c24-be5d-42de-ba91-5584becb21e3/mysql-bootstrap/0.log" Feb 20 18:01:53 crc kubenswrapper[4697]: I0220 18:01:53.510608 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33594c24-be5d-42de-ba91-5584becb21e3/galera/0.log" Feb 20 18:01:54 crc kubenswrapper[4697]: I0220 18:01:54.174609 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8dae4cc2-1fb9-47ff-af11-854c15a884a3/openstackclient/0.log" Feb 20 18:01:54 crc kubenswrapper[4697]: I0220 18:01:54.196491 4697 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-62kkp_aa3e6a99-a9a4-4578-94c6-8a4b641405ec/ovn-controller/0.log" Feb 20 18:01:54 crc kubenswrapper[4697]: I0220 18:01:54.353670 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-wrhqd_650a1475-c144-4a21-a156-9859cb1418d4/openstack-network-exporter/0.log" Feb 20 18:01:54 crc kubenswrapper[4697]: I0220 18:01:54.540991 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5f42a083-98ed-403d-90ca-cc4ef5ba79d1/nova-metadata-metadata/0.log" Feb 20 18:01:54 crc kubenswrapper[4697]: I0220 18:01:54.587624 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wh5h_be3e14f5-1877-4618-87b6-b60623792988/ovsdb-server-init/0.log" Feb 20 18:01:54 crc kubenswrapper[4697]: I0220 18:01:54.722409 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wh5h_be3e14f5-1877-4618-87b6-b60623792988/ovsdb-server-init/0.log" Feb 20 18:01:54 crc kubenswrapper[4697]: I0220 18:01:54.830294 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wh5h_be3e14f5-1877-4618-87b6-b60623792988/ovsdb-server/0.log" Feb 20 18:01:54 crc kubenswrapper[4697]: I0220 18:01:54.934466 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-lc2h7_b1aebd55-3d79-403b-978d-04afedd25c3d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:55 crc kubenswrapper[4697]: I0220 18:01:55.019764 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00989e93-9419-47e9-a3ba-7b5e65910be9/openstack-network-exporter/0.log" Feb 20 18:01:55 crc kubenswrapper[4697]: I0220 18:01:55.078790 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wh5h_be3e14f5-1877-4618-87b6-b60623792988/ovs-vswitchd/0.log" Feb 20 18:01:55 crc 
kubenswrapper[4697]: I0220 18:01:55.149999 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00989e93-9419-47e9-a3ba-7b5e65910be9/ovn-northd/0.log" Feb 20 18:01:55 crc kubenswrapper[4697]: I0220 18:01:55.243039 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fb5d6748-960a-41d3-a11e-6dd21c3dd46f/openstack-network-exporter/0.log" Feb 20 18:01:55 crc kubenswrapper[4697]: I0220 18:01:55.481597 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fb5d6748-960a-41d3-a11e-6dd21c3dd46f/ovsdbserver-nb/0.log" Feb 20 18:01:55 crc kubenswrapper[4697]: I0220 18:01:55.859641 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b179bb2c-61ea-4bca-860b-b419bb8d3341/ovsdbserver-sb/0.log" Feb 20 18:01:55 crc kubenswrapper[4697]: I0220 18:01:55.862675 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b179bb2c-61ea-4bca-860b-b419bb8d3341/openstack-network-exporter/0.log" Feb 20 18:01:56 crc kubenswrapper[4697]: I0220 18:01:56.018785 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57d8cdd7b4-pxpls_8cecab8a-e2db-476f-89d1-b28b4f585d57/placement-api/0.log" Feb 20 18:01:56 crc kubenswrapper[4697]: I0220 18:01:56.165754 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2cb601d7-bad1-4085-9519-4cb9927fa531/init-config-reloader/0.log" Feb 20 18:01:56 crc kubenswrapper[4697]: I0220 18:01:56.260619 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57d8cdd7b4-pxpls_8cecab8a-e2db-476f-89d1-b28b4f585d57/placement-log/0.log" Feb 20 18:01:56 crc kubenswrapper[4697]: I0220 18:01:56.316521 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2cb601d7-bad1-4085-9519-4cb9927fa531/init-config-reloader/0.log" Feb 20 18:01:56 crc 
kubenswrapper[4697]: I0220 18:01:56.376206 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2cb601d7-bad1-4085-9519-4cb9927fa531/config-reloader/0.log" Feb 20 18:01:56 crc kubenswrapper[4697]: I0220 18:01:56.381511 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2cb601d7-bad1-4085-9519-4cb9927fa531/prometheus/0.log" Feb 20 18:01:56 crc kubenswrapper[4697]: I0220 18:01:56.494945 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2cb601d7-bad1-4085-9519-4cb9927fa531/thanos-sidecar/0.log" Feb 20 18:01:56 crc kubenswrapper[4697]: I0220 18:01:56.555087 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_12a44ac2-6e80-4bac-9079-0b6637de700a/setup-container/0.log" Feb 20 18:01:56 crc kubenswrapper[4697]: I0220 18:01:56.793312 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9972c409-92f3-4ec7-9b59-cccd334b761e/setup-container/0.log" Feb 20 18:01:56 crc kubenswrapper[4697]: I0220 18:01:56.795045 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_12a44ac2-6e80-4bac-9079-0b6637de700a/setup-container/0.log" Feb 20 18:01:56 crc kubenswrapper[4697]: I0220 18:01:56.815675 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_12a44ac2-6e80-4bac-9079-0b6637de700a/rabbitmq/0.log" Feb 20 18:01:57 crc kubenswrapper[4697]: I0220 18:01:57.036771 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9972c409-92f3-4ec7-9b59-cccd334b761e/setup-container/0.log" Feb 20 18:01:57 crc kubenswrapper[4697]: I0220 18:01:57.060241 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9972c409-92f3-4ec7-9b59-cccd334b761e/rabbitmq/0.log" Feb 20 18:01:57 crc kubenswrapper[4697]: I0220 
18:01:57.092959 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk_eace25c0-d234-43c5-88a0-f8ba1fc78dac/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:57 crc kubenswrapper[4697]: I0220 18:01:57.281587 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-gp4g2_603c4295-1159-4a8b-856f-c40cb2a0838c/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:57 crc kubenswrapper[4697]: I0220 18:01:57.364536 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf_143cf213-8274-47bd-b6f4-80f2d465275c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:57 crc kubenswrapper[4697]: I0220 18:01:57.552410 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7tfq6_0700858d-9b11-4cca-a80c-143da84eea6e/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:57 crc kubenswrapper[4697]: I0220 18:01:57.602852 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-57pvl_ab058f87-f768-4d3a-b7cf-a39feef5c5f6/ssh-known-hosts-edpm-deployment/0.log" Feb 20 18:01:57 crc kubenswrapper[4697]: I0220 18:01:57.681259 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_19a9810e-52c4-4428-9fca-5bf65d100f50/memcached/0.log" Feb 20 18:01:57 crc kubenswrapper[4697]: I0220 18:01:57.833158 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d485f4f89-v5ctf_536e289e-762f-4f9f-8b58-027b09cf2609/proxy-server/0.log" Feb 20 18:01:57 crc kubenswrapper[4697]: I0220 18:01:57.886372 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d485f4f89-v5ctf_536e289e-762f-4f9f-8b58-027b09cf2609/proxy-httpd/0.log" Feb 20 18:01:57 crc kubenswrapper[4697]: 
I0220 18:01:57.910440 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-frqkk_a267ce98-60eb-4c3c-8906-de42f6872680/swift-ring-rebalance/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.032380 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/account-auditor/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.051811 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/account-reaper/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.079844 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/account-server/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.120614 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/account-replicator/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.141075 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/container-auditor/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.234123 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/container-server/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.240515 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/container-replicator/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.276578 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/container-updater/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.312085 4697 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/object-expirer/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.324224 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/object-auditor/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.427790 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/object-replicator/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.445082 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/object-updater/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.447722 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/object-server/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.513624 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/swift-recon-cron/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.537645 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/rsync/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.675689 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9st56_db0ef420-9810-4b2f-8f10-b3fb710293c6/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.723327 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e83acb01-3a91-4950-848f-d447679c0533/tempest-tests-tempest-tests-runner/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.897599 4697 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_1b384505-09b4-453d-9418-b3116dfe429e/test-operator-logs-container/0.log" Feb 20 18:01:58 crc kubenswrapper[4697]: I0220 18:01:58.936582 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8_0ed71fc6-e5c0-40fe-988e-04a30088f620/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:01:59 crc kubenswrapper[4697]: I0220 18:01:59.611363 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_14b3fb67-823a-4da5-a42d-27745717ba8b/watcher-applier/0.log" Feb 20 18:02:00 crc kubenswrapper[4697]: I0220 18:02:00.302845 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_04b1d7c5-4d8e-4962-b396-855adb2605d7/watcher-api-log/0.log" Feb 20 18:02:01 crc kubenswrapper[4697]: I0220 18:02:01.184173 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 18:02:01 crc kubenswrapper[4697]: I0220 18:02:01.184223 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 18:02:02 crc kubenswrapper[4697]: I0220 18:02:02.151364 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_21f4378d-bb26-408d-9613-82246765639b/watcher-decision-engine/0.log" Feb 20 18:02:02 crc kubenswrapper[4697]: I0220 18:02:02.822093 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-api-0_04b1d7c5-4d8e-4962-b396-855adb2605d7/watcher-api/0.log" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.705201 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fhwfv"] Feb 20 18:02:21 crc kubenswrapper[4697]: E0220 18:02:21.706128 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3015a78a-8332-427e-8840-754a04e38f6d" containerName="container-00" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.706141 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="3015a78a-8332-427e-8840-754a04e38f6d" containerName="container-00" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.706316 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="3015a78a-8332-427e-8840-754a04e38f6d" containerName="container-00" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.707778 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.729417 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhwfv"] Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.780224 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk4nm\" (UniqueName: \"kubernetes.io/projected/991ce506-ec9a-4718-bb8c-2cb3a3161bba-kube-api-access-hk4nm\") pod \"community-operators-fhwfv\" (UID: \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.780320 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-catalog-content\") pod \"community-operators-fhwfv\" (UID: 
\"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.780375 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-utilities\") pod \"community-operators-fhwfv\" (UID: \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.881424 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk4nm\" (UniqueName: \"kubernetes.io/projected/991ce506-ec9a-4718-bb8c-2cb3a3161bba-kube-api-access-hk4nm\") pod \"community-operators-fhwfv\" (UID: \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.881626 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-catalog-content\") pod \"community-operators-fhwfv\" (UID: \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.881706 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-utilities\") pod \"community-operators-fhwfv\" (UID: \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.882165 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-catalog-content\") pod \"community-operators-fhwfv\" (UID: 
\"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:21 crc kubenswrapper[4697]: I0220 18:02:21.882235 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-utilities\") pod \"community-operators-fhwfv\" (UID: \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:22 crc kubenswrapper[4697]: I0220 18:02:22.280792 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk4nm\" (UniqueName: \"kubernetes.io/projected/991ce506-ec9a-4718-bb8c-2cb3a3161bba-kube-api-access-hk4nm\") pod \"community-operators-fhwfv\" (UID: \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:22 crc kubenswrapper[4697]: I0220 18:02:22.379743 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:22 crc kubenswrapper[4697]: I0220 18:02:22.904704 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhwfv"] Feb 20 18:02:23 crc kubenswrapper[4697]: I0220 18:02:23.032851 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhwfv" event={"ID":"991ce506-ec9a-4718-bb8c-2cb3a3161bba","Type":"ContainerStarted","Data":"42d5e020096eb66d8519e6e78e29298f94559e086e30f30eb0d53c0aaed39271"} Feb 20 18:02:24 crc kubenswrapper[4697]: I0220 18:02:24.043962 4697 generic.go:334] "Generic (PLEG): container finished" podID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" containerID="fd7d1ce3691437682d955181649b91ef535d9b197929ae59202ff42fe84b07fc" exitCode=0 Feb 20 18:02:24 crc kubenswrapper[4697]: I0220 18:02:24.044077 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhwfv" 
event={"ID":"991ce506-ec9a-4718-bb8c-2cb3a3161bba","Type":"ContainerDied","Data":"fd7d1ce3691437682d955181649b91ef535d9b197929ae59202ff42fe84b07fc"} Feb 20 18:02:25 crc kubenswrapper[4697]: I0220 18:02:25.053342 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhwfv" event={"ID":"991ce506-ec9a-4718-bb8c-2cb3a3161bba","Type":"ContainerStarted","Data":"333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7"} Feb 20 18:02:27 crc kubenswrapper[4697]: I0220 18:02:27.075068 4697 generic.go:334] "Generic (PLEG): container finished" podID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" containerID="333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7" exitCode=0 Feb 20 18:02:27 crc kubenswrapper[4697]: I0220 18:02:27.075126 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhwfv" event={"ID":"991ce506-ec9a-4718-bb8c-2cb3a3161bba","Type":"ContainerDied","Data":"333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7"} Feb 20 18:02:28 crc kubenswrapper[4697]: I0220 18:02:28.085848 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhwfv" event={"ID":"991ce506-ec9a-4718-bb8c-2cb3a3161bba","Type":"ContainerStarted","Data":"8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e"} Feb 20 18:02:29 crc kubenswrapper[4697]: I0220 18:02:29.592651 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/util/0.log" Feb 20 18:02:29 crc kubenswrapper[4697]: I0220 18:02:29.948693 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/pull/0.log" Feb 20 18:02:29 crc kubenswrapper[4697]: I0220 18:02:29.973481 4697 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/util/0.log" Feb 20 18:02:30 crc kubenswrapper[4697]: I0220 18:02:30.207446 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/pull/0.log" Feb 20 18:02:30 crc kubenswrapper[4697]: I0220 18:02:30.401598 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/util/0.log" Feb 20 18:02:30 crc kubenswrapper[4697]: I0220 18:02:30.432899 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/pull/0.log" Feb 20 18:02:30 crc kubenswrapper[4697]: I0220 18:02:30.631393 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/extract/0.log" Feb 20 18:02:30 crc kubenswrapper[4697]: I0220 18:02:30.837906 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-jxdb5_f56f8133-887c-456a-9cbf-6df7713789b3/manager/0.log" Feb 20 18:02:31 crc kubenswrapper[4697]: I0220 18:02:31.180939 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-5xkg5_71f3f3ad-1f6c-4d59-9fc8-b036014c1068/manager/0.log" Feb 20 18:02:31 crc kubenswrapper[4697]: I0220 18:02:31.184649 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 18:02:31 crc kubenswrapper[4697]: I0220 18:02:31.184786 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 18:02:31 crc kubenswrapper[4697]: I0220 18:02:31.184899 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 18:02:31 crc kubenswrapper[4697]: I0220 18:02:31.185603 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f70c8b9f6708d91f911ce6e63dfc4940e276016d2a4151ff12953e15e5419aa9"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 18:02:31 crc kubenswrapper[4697]: I0220 18:02:31.185731 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://f70c8b9f6708d91f911ce6e63dfc4940e276016d2a4151ff12953e15e5419aa9" gracePeriod=600 Feb 20 18:02:31 crc kubenswrapper[4697]: I0220 18:02:31.316554 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-4nqmq_d233b891-dac3-4565-971b-85141828260d/manager/0.log" Feb 20 18:02:31 crc kubenswrapper[4697]: I0220 18:02:31.532398 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-qjfrm_b0605f71-51c5-49d9-8936-77affb7cf0bf/manager/0.log" Feb 20 
18:02:31 crc kubenswrapper[4697]: I0220 18:02:31.859856 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-9zdp9_0b5f03ab-32bb-48ef-b7d7-1ede5fb51924/manager/0.log" Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.124251 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="f70c8b9f6708d91f911ce6e63dfc4940e276016d2a4151ff12953e15e5419aa9" exitCode=0 Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.124284 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"f70c8b9f6708d91f911ce6e63dfc4940e276016d2a4151ff12953e15e5419aa9"} Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.124333 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6"} Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.124351 4697 scope.go:117] "RemoveContainer" containerID="0e8d619d411b8d1963615bea87a74186d2fd898fb29f875e55f9a1d66d1ca712" Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.152841 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fhwfv" podStartSLOduration=7.652747801 podStartE2EDuration="11.152819096s" podCreationTimestamp="2026-02-20 18:02:21 +0000 UTC" firstStartedPulling="2026-02-20 18:02:24.046635094 +0000 UTC m=+5451.826680512" lastFinishedPulling="2026-02-20 18:02:27.546706399 +0000 UTC m=+5455.326751807" observedRunningTime="2026-02-20 18:02:28.104980848 +0000 UTC m=+5455.885026286" watchObservedRunningTime="2026-02-20 18:02:32.152819096 +0000 UTC m=+5459.932864504" Feb 20 
18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.354944 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-82v6j_d9926f6e-afca-48ad-8a52-fb7f53ba3dec/manager/0.log" Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.380598 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.380665 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.431263 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.660363 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-j9lmt_d30d696e-1555-4fc2-9316-c795de608048/manager/0.log" Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.727476 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-rw4q5_4fdb933f-aa86-4b88-9b08-4783ce0f6e0c/manager/0.log" Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.981550 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-gsmns_366fd13a-060b-4572-9541-dbf88a507588/manager/0.log" Feb 20 18:02:32 crc kubenswrapper[4697]: I0220 18:02:32.992126 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-dknf8_560f1df6-c03f-42ad-8175-5508f56e1ecc/manager/0.log" Feb 20 18:02:33 crc kubenswrapper[4697]: I0220 18:02:33.748745 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:33 crc kubenswrapper[4697]: I0220 18:02:33.806215 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhwfv"] Feb 20 18:02:33 crc kubenswrapper[4697]: I0220 18:02:33.895558 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-ct4nk_21c4fbfe-19b7-4303-b4bf-72dbc90044dd/manager/0.log" Feb 20 18:02:34 crc kubenswrapper[4697]: I0220 18:02:34.014257 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-4mtzl_cae8d6b1-4649-40bb-b710-1197ac78db1b/manager/0.log" Feb 20 18:02:34 crc kubenswrapper[4697]: I0220 18:02:34.413102 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8_3f3b7ed7-e806-4fa9-ac88-381c0b4bd237/manager/0.log" Feb 20 18:02:34 crc kubenswrapper[4697]: I0220 18:02:34.800206 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-64dbc77f9f-pqx8x_9df0ae9c-f41a-4c92-b62e-ff0f230da65c/operator/0.log" Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.011947 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4nlwj_b3087f12-ba44-4ef2-af22-3d77e30b1d84/registry-server/0.log" Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.158779 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fhwfv" podUID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" containerName="registry-server" containerID="cri-o://8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e" gracePeriod=2 Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.402588 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-lsgnq_55a77be1-9486-4b8a-acc6-a4d8532016d3/manager/0.log" Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.617499 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-l5qll_8c8d2c10-e4b6-4d37-977f-7ad685981d2f/manager/0.log" Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.721707 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.862296 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-catalog-content\") pod \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\" (UID: \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.862452 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-utilities\") pod \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\" (UID: \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.862475 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk4nm\" (UniqueName: \"kubernetes.io/projected/991ce506-ec9a-4718-bb8c-2cb3a3161bba-kube-api-access-hk4nm\") pod \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\" (UID: \"991ce506-ec9a-4718-bb8c-2cb3a3161bba\") " Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.863271 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-utilities" (OuterVolumeSpecName: "utilities") pod "991ce506-ec9a-4718-bb8c-2cb3a3161bba" (UID: "991ce506-ec9a-4718-bb8c-2cb3a3161bba"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.869469 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991ce506-ec9a-4718-bb8c-2cb3a3161bba-kube-api-access-hk4nm" (OuterVolumeSpecName: "kube-api-access-hk4nm") pod "991ce506-ec9a-4718-bb8c-2cb3a3161bba" (UID: "991ce506-ec9a-4718-bb8c-2cb3a3161bba"). InnerVolumeSpecName "kube-api-access-hk4nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.877613 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xnpms_8546e4ea-d7f0-4244-8496-e962809c4203/operator/0.log" Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.943021 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "991ce506-ec9a-4718-bb8c-2cb3a3161bba" (UID: "991ce506-ec9a-4718-bb8c-2cb3a3161bba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.964595 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.964623 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/991ce506-ec9a-4718-bb8c-2cb3a3161bba-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 18:02:35 crc kubenswrapper[4697]: I0220 18:02:35.964633 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk4nm\" (UniqueName: \"kubernetes.io/projected/991ce506-ec9a-4718-bb8c-2cb3a3161bba-kube-api-access-hk4nm\") on node \"crc\" DevicePath \"\"" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.132050 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-2mxvz_cb67b5ad-353e-4d96-8d94-fc69e4801f64/manager/0.log" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.169302 4697 generic.go:334] "Generic (PLEG): container finished" podID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" containerID="8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e" exitCode=0 Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.169348 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhwfv" event={"ID":"991ce506-ec9a-4718-bb8c-2cb3a3161bba","Type":"ContainerDied","Data":"8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e"} Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.169374 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhwfv" 
event={"ID":"991ce506-ec9a-4718-bb8c-2cb3a3161bba","Type":"ContainerDied","Data":"42d5e020096eb66d8519e6e78e29298f94559e086e30f30eb0d53c0aaed39271"} Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.169383 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhwfv" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.169391 4697 scope.go:117] "RemoveContainer" containerID="8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.214102 4697 scope.go:117] "RemoveContainer" containerID="333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.226476 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhwfv"] Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.242666 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fhwfv"] Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.251614 4697 scope.go:117] "RemoveContainer" containerID="fd7d1ce3691437682d955181649b91ef535d9b197929ae59202ff42fe84b07fc" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.290572 4697 scope.go:117] "RemoveContainer" containerID="8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e" Feb 20 18:02:36 crc kubenswrapper[4697]: E0220 18:02:36.291085 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e\": container with ID starting with 8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e not found: ID does not exist" containerID="8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.291129 4697 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e"} err="failed to get container status \"8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e\": rpc error: code = NotFound desc = could not find container \"8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e\": container with ID starting with 8ac8d3ce3f391ffc4da9f0cab4d715a7113aa40478e5f64af360bded958e9b7e not found: ID does not exist" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.291175 4697 scope.go:117] "RemoveContainer" containerID="333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7" Feb 20 18:02:36 crc kubenswrapper[4697]: E0220 18:02:36.291420 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7\": container with ID starting with 333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7 not found: ID does not exist" containerID="333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.291478 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7"} err="failed to get container status \"333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7\": rpc error: code = NotFound desc = could not find container \"333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7\": container with ID starting with 333b898ea0c74bcd76be71016dd278989461150a5f83dd144eb1615f4f9868f7 not found: ID does not exist" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.291505 4697 scope.go:117] "RemoveContainer" containerID="fd7d1ce3691437682d955181649b91ef535d9b197929ae59202ff42fe84b07fc" Feb 20 18:02:36 crc kubenswrapper[4697]: E0220 18:02:36.291847 4697 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"fd7d1ce3691437682d955181649b91ef535d9b197929ae59202ff42fe84b07fc\": container with ID starting with fd7d1ce3691437682d955181649b91ef535d9b197929ae59202ff42fe84b07fc not found: ID does not exist" containerID="fd7d1ce3691437682d955181649b91ef535d9b197929ae59202ff42fe84b07fc" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.291890 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7d1ce3691437682d955181649b91ef535d9b197929ae59202ff42fe84b07fc"} err="failed to get container status \"fd7d1ce3691437682d955181649b91ef535d9b197929ae59202ff42fe84b07fc\": rpc error: code = NotFound desc = could not find container \"fd7d1ce3691437682d955181649b91ef535d9b197929ae59202ff42fe84b07fc\": container with ID starting with fd7d1ce3691437682d955181649b91ef535d9b197929ae59202ff42fe84b07fc not found: ID does not exist" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.692079 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-wjwzl_6af9a0a9-0546-4a59-bdca-1a0609421010/manager/0.log" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.714922 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-b89r4_a675eb01-18af-4776-94e6-64c0b392248b/manager/0.log" Feb 20 18:02:36 crc kubenswrapper[4697]: I0220 18:02:36.893615 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" path="/var/lib/kubelet/pods/991ce506-ec9a-4718-bb8c-2cb3a3161bba/volumes" Feb 20 18:02:37 crc kubenswrapper[4697]: I0220 18:02:37.094549 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-9d9d9f9cd-t7nvz_d8c72591-eb8d-4553-867d-60482d51c4db/manager/0.log" Feb 20 18:02:37 crc kubenswrapper[4697]: I0220 
18:02:37.441232 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-b45cc898b-82j7k_33e3fc43-dbfe-4fff-bac3-6021dfa84982/manager/0.log" Feb 20 18:02:37 crc kubenswrapper[4697]: I0220 18:02:37.488214 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-7jlc2_8b48098d-ef4c-4cde-beef-a7c34573699b/manager/0.log" Feb 20 18:02:41 crc kubenswrapper[4697]: I0220 18:02:41.240369 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-kqjzr_fcf00eef-940f-4da3-8359-325f1abb0c6d/manager/0.log" Feb 20 18:02:59 crc kubenswrapper[4697]: I0220 18:02:59.890792 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dkznt_8912bf5c-045e-4c86-9a09-41c4cab10139/control-plane-machine-set-operator/0.log" Feb 20 18:03:00 crc kubenswrapper[4697]: I0220 18:03:00.084968 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f6d4m_ca43f4e8-ab22-4573-b5cb-5d58dbf788f1/kube-rbac-proxy/0.log" Feb 20 18:03:00 crc kubenswrapper[4697]: I0220 18:03:00.131607 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f6d4m_ca43f4e8-ab22-4573-b5cb-5d58dbf788f1/machine-api-operator/0.log" Feb 20 18:03:15 crc kubenswrapper[4697]: I0220 18:03:15.196980 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-x2rxq_ee3c17f3-a89c-49fa-8cf9-75e4914401cc/cert-manager-controller/0.log" Feb 20 18:03:15 crc kubenswrapper[4697]: I0220 18:03:15.323480 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-rs7j9_2c82f3c1-cc18-41b5-9489-61f52f31a74a/cert-manager-cainjector/0.log" Feb 20 18:03:15 crc 
kubenswrapper[4697]: I0220 18:03:15.402617 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-zd9ck_36e190bc-14ce-4ce6-92ac-d48197515527/cert-manager-webhook/0.log" Feb 20 18:03:27 crc kubenswrapper[4697]: I0220 18:03:27.860838 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-9x7l9_d88c9050-548c-4305-b532-085c83436b3e/nmstate-console-plugin/0.log" Feb 20 18:03:28 crc kubenswrapper[4697]: I0220 18:03:28.185930 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mmnb6_89578f54-097d-4b2b-9809-103034a3a114/nmstate-handler/0.log" Feb 20 18:03:28 crc kubenswrapper[4697]: I0220 18:03:28.341740 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-vwprb_7b12f1ce-9e0b-458b-990a-e38c2f2139c5/kube-rbac-proxy/0.log" Feb 20 18:03:28 crc kubenswrapper[4697]: I0220 18:03:28.354361 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-vwprb_7b12f1ce-9e0b-458b-990a-e38c2f2139c5/nmstate-metrics/0.log" Feb 20 18:03:28 crc kubenswrapper[4697]: I0220 18:03:28.570076 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-ljdjd_278cea78-f0e4-4785-bc49-aa335706ccac/nmstate-operator/0.log" Feb 20 18:03:28 crc kubenswrapper[4697]: I0220 18:03:28.592194 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-xmzrc_567d32fd-31d2-4822-b487-ec35c250663d/nmstate-webhook/0.log" Feb 20 18:03:42 crc kubenswrapper[4697]: I0220 18:03:42.373357 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-cqddv_214e6680-a3c8-4286-9c91-d68893ba73be/prometheus-operator/0.log" Feb 20 18:03:42 crc kubenswrapper[4697]: I0220 18:03:42.606643 4697 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5_565bbe20-d8eb-4878-a048-4d78d8123f6d/prometheus-operator-admission-webhook/0.log" Feb 20 18:03:42 crc kubenswrapper[4697]: I0220 18:03:42.620726 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8679bd8497-phkqw_68c668ba-cbcc-4330-95e2-012c78108925/prometheus-operator-admission-webhook/0.log" Feb 20 18:03:42 crc kubenswrapper[4697]: I0220 18:03:42.788400 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8wmf5_12db7fd9-ce39-43cf-99b7-3a56791c0390/operator/0.log" Feb 20 18:03:42 crc kubenswrapper[4697]: I0220 18:03:42.885998 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-j5gtw_94f7dd38-b255-46cd-8b05-4c720857dd86/perses-operator/0.log" Feb 20 18:03:56 crc kubenswrapper[4697]: I0220 18:03:56.619265 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-64qzx_e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3/controller/0.log" Feb 20 18:03:56 crc kubenswrapper[4697]: I0220 18:03:56.621412 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-64qzx_e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3/kube-rbac-proxy/0.log" Feb 20 18:03:56 crc kubenswrapper[4697]: I0220 18:03:56.788469 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-frr-files/0.log" Feb 20 18:03:56 crc kubenswrapper[4697]: I0220 18:03:56.995465 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-frr-files/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.010015 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-reloader/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.015297 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-metrics/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.055383 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-reloader/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.211012 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-frr-files/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.219532 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-reloader/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.243316 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-metrics/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.276472 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-metrics/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.450980 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-metrics/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.463975 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/controller/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.476245 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-frr-files/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.493834 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-reloader/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.627916 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/frr-metrics/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.664357 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/kube-rbac-proxy/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.713561 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/kube-rbac-proxy-frr/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.835660 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/reloader/0.log" Feb 20 18:03:57 crc kubenswrapper[4697]: I0220 18:03:57.933481 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-lk6qx_08c8d96c-3974-4e6f-ae8e-7283a628643e/frr-k8s-webhook-server/0.log" Feb 20 18:03:58 crc kubenswrapper[4697]: I0220 18:03:58.243743 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6574bdbb48-rfq5g_538df488-7224-4d8b-a08c-463865282008/manager/0.log" Feb 20 18:03:58 crc kubenswrapper[4697]: I0220 18:03:58.413513 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-77767df4c8-b6xsv_50d070c4-1559-40ac-9375-96ddffeb6b1a/webhook-server/0.log" Feb 20 18:03:58 crc kubenswrapper[4697]: I0220 18:03:58.463999 4697 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fzjzn_7d6d1b55-ac21-4967-9487-53b7b236b847/kube-rbac-proxy/0.log" Feb 20 18:03:59 crc kubenswrapper[4697]: I0220 18:03:59.157504 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fzjzn_7d6d1b55-ac21-4967-9487-53b7b236b847/speaker/0.log" Feb 20 18:03:59 crc kubenswrapper[4697]: I0220 18:03:59.280229 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/frr/0.log" Feb 20 18:04:12 crc kubenswrapper[4697]: I0220 18:04:12.471725 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/util/0.log" Feb 20 18:04:12 crc kubenswrapper[4697]: I0220 18:04:12.663124 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/pull/0.log" Feb 20 18:04:12 crc kubenswrapper[4697]: I0220 18:04:12.673174 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/util/0.log" Feb 20 18:04:12 crc kubenswrapper[4697]: I0220 18:04:12.679883 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/pull/0.log" Feb 20 18:04:12 crc kubenswrapper[4697]: I0220 18:04:12.870647 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/util/0.log" Feb 20 18:04:12 crc kubenswrapper[4697]: I0220 18:04:12.887256 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/extract/0.log" Feb 20 18:04:12 crc kubenswrapper[4697]: I0220 18:04:12.922505 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/pull/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.069206 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/util/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.206134 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/pull/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.229232 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/util/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.251897 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/pull/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.429703 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/extract/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.443028 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/util/0.log" Feb 
20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.449556 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/pull/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.610580 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-utilities/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.792544 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-content/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.805746 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-content/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.812289 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-utilities/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.983640 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-content/0.log" Feb 20 18:04:13 crc kubenswrapper[4697]: I0220 18:04:13.996301 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-utilities/0.log" Feb 20 18:04:14 crc kubenswrapper[4697]: I0220 18:04:14.221629 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-utilities/0.log" Feb 20 18:04:14 crc kubenswrapper[4697]: I0220 18:04:14.496396 4697 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-utilities/0.log" Feb 20 18:04:14 crc kubenswrapper[4697]: I0220 18:04:14.496653 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-content/0.log" Feb 20 18:04:14 crc kubenswrapper[4697]: I0220 18:04:14.525971 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-content/0.log" Feb 20 18:04:14 crc kubenswrapper[4697]: I0220 18:04:14.672414 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/registry-server/0.log" Feb 20 18:04:14 crc kubenswrapper[4697]: I0220 18:04:14.672938 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-utilities/0.log" Feb 20 18:04:14 crc kubenswrapper[4697]: I0220 18:04:14.703032 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-content/0.log" Feb 20 18:04:14 crc kubenswrapper[4697]: I0220 18:04:14.857606 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/util/0.log" Feb 20 18:04:15 crc kubenswrapper[4697]: I0220 18:04:15.063595 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/pull/0.log" Feb 20 18:04:15 crc kubenswrapper[4697]: I0220 18:04:15.097506 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/util/0.log" Feb 20 18:04:15 crc kubenswrapper[4697]: I0220 18:04:15.191505 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/pull/0.log" Feb 20 18:04:15 crc kubenswrapper[4697]: I0220 18:04:15.427711 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/extract/0.log" Feb 20 18:04:15 crc kubenswrapper[4697]: I0220 18:04:15.433691 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/util/0.log" Feb 20 18:04:15 crc kubenswrapper[4697]: I0220 18:04:15.437988 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/pull/0.log" Feb 20 18:04:15 crc kubenswrapper[4697]: I0220 18:04:15.604262 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/registry-server/0.log" Feb 20 18:04:15 crc kubenswrapper[4697]: I0220 18:04:15.913273 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-shppd_84dd2186-876f-440f-8187-51f7bdda1bb8/marketplace-operator/0.log" Feb 20 18:04:15 crc kubenswrapper[4697]: I0220 18:04:15.984412 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-utilities/0.log" Feb 20 18:04:16 crc kubenswrapper[4697]: I0220 18:04:16.291318 4697 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-utilities/0.log" Feb 20 18:04:16 crc kubenswrapper[4697]: I0220 18:04:16.337111 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-content/0.log" Feb 20 18:04:16 crc kubenswrapper[4697]: I0220 18:04:16.376249 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-content/0.log" Feb 20 18:04:16 crc kubenswrapper[4697]: I0220 18:04:16.526102 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-utilities/0.log" Feb 20 18:04:16 crc kubenswrapper[4697]: I0220 18:04:16.543449 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-content/0.log" Feb 20 18:04:16 crc kubenswrapper[4697]: I0220 18:04:16.632839 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-utilities/0.log" Feb 20 18:04:16 crc kubenswrapper[4697]: I0220 18:04:16.692468 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/registry-server/0.log" Feb 20 18:04:16 crc kubenswrapper[4697]: I0220 18:04:16.814550 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-utilities/0.log" Feb 20 18:04:16 crc kubenswrapper[4697]: I0220 18:04:16.862586 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-content/0.log" Feb 20 18:04:16 crc kubenswrapper[4697]: I0220 18:04:16.899710 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-content/0.log" Feb 20 18:04:17 crc kubenswrapper[4697]: I0220 18:04:17.020208 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-utilities/0.log" Feb 20 18:04:17 crc kubenswrapper[4697]: I0220 18:04:17.035918 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-content/0.log" Feb 20 18:04:17 crc kubenswrapper[4697]: I0220 18:04:17.604364 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/registry-server/0.log" Feb 20 18:04:29 crc kubenswrapper[4697]: I0220 18:04:29.966639 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-cqddv_214e6680-a3c8-4286-9c91-d68893ba73be/prometheus-operator/0.log" Feb 20 18:04:30 crc kubenswrapper[4697]: I0220 18:04:30.033832 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5_565bbe20-d8eb-4878-a048-4d78d8123f6d/prometheus-operator-admission-webhook/0.log" Feb 20 18:04:30 crc kubenswrapper[4697]: I0220 18:04:30.042815 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8679bd8497-phkqw_68c668ba-cbcc-4330-95e2-012c78108925/prometheus-operator-admission-webhook/0.log" Feb 20 18:04:30 crc kubenswrapper[4697]: I0220 18:04:30.166717 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8wmf5_12db7fd9-ce39-43cf-99b7-3a56791c0390/operator/0.log" Feb 20 18:04:30 crc kubenswrapper[4697]: I0220 18:04:30.205990 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-j5gtw_94f7dd38-b255-46cd-8b05-4c720857dd86/perses-operator/0.log" Feb 20 18:04:47 crc kubenswrapper[4697]: I0220 18:04:47.859994 4697 scope.go:117] "RemoveContainer" containerID="4cb4b53163b69db522c40e392a516b8911eced7243787a9c5dc0c8879a23b225" Feb 20 18:04:47 crc kubenswrapper[4697]: I0220 18:04:47.884709 4697 scope.go:117] "RemoveContainer" containerID="c728c54126ce607b96ef95eb0c95586937fc490e3bb8a282ae9a61b35f55801d" Feb 20 18:04:47 crc kubenswrapper[4697]: I0220 18:04:47.941285 4697 scope.go:117] "RemoveContainer" containerID="9dc36024961282dbaac4f3ece632211a4bda1152e21b2bb4609983dd494ce984" Feb 20 18:05:01 crc kubenswrapper[4697]: I0220 18:05:01.184976 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 18:05:01 crc kubenswrapper[4697]: I0220 18:05:01.185878 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 18:05:31 crc kubenswrapper[4697]: I0220 18:05:31.184926 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 20 18:05:31 crc kubenswrapper[4697]: I0220 18:05:31.185521 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 18:05:44 crc kubenswrapper[4697]: I0220 18:05:44.846921 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-98wth"] Feb 20 18:05:44 crc kubenswrapper[4697]: E0220 18:05:44.848096 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" containerName="extract-utilities" Feb 20 18:05:44 crc kubenswrapper[4697]: I0220 18:05:44.848115 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" containerName="extract-utilities" Feb 20 18:05:44 crc kubenswrapper[4697]: E0220 18:05:44.848153 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" containerName="registry-server" Feb 20 18:05:44 crc kubenswrapper[4697]: I0220 18:05:44.848161 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" containerName="registry-server" Feb 20 18:05:44 crc kubenswrapper[4697]: E0220 18:05:44.848183 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" containerName="extract-content" Feb 20 18:05:44 crc kubenswrapper[4697]: I0220 18:05:44.848191 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" containerName="extract-content" Feb 20 18:05:44 crc kubenswrapper[4697]: I0220 18:05:44.848442 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="991ce506-ec9a-4718-bb8c-2cb3a3161bba" containerName="registry-server" Feb 20 18:05:44 crc 
kubenswrapper[4697]: I0220 18:05:44.850602 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:44 crc kubenswrapper[4697]: I0220 18:05:44.906790 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98wth"] Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.025822 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-utilities\") pod \"redhat-operators-98wth\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.025948 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7565l"] Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.026091 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-catalog-content\") pod \"redhat-operators-98wth\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.026331 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s424\" (UniqueName: \"kubernetes.io/projected/0fc55963-9326-459a-9594-7f06fbd6c697-kube-api-access-8s424\") pod \"redhat-operators-98wth\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.029053 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.042256 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7565l"] Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.128360 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-catalog-content\") pod \"certified-operators-7565l\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.133528 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-catalog-content\") pod \"redhat-operators-98wth\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.133675 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-utilities\") pod \"certified-operators-7565l\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.133824 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s424\" (UniqueName: \"kubernetes.io/projected/0fc55963-9326-459a-9594-7f06fbd6c697-kube-api-access-8s424\") pod \"redhat-operators-98wth\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.133989 4697 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-utilities\") pod \"redhat-operators-98wth\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.134127 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q86tk\" (UniqueName: \"kubernetes.io/projected/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-kube-api-access-q86tk\") pod \"certified-operators-7565l\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.134769 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-catalog-content\") pod \"redhat-operators-98wth\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.135391 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-utilities\") pod \"redhat-operators-98wth\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.169550 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s424\" (UniqueName: \"kubernetes.io/projected/0fc55963-9326-459a-9594-7f06fbd6c697-kube-api-access-8s424\") pod \"redhat-operators-98wth\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.204922 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.236995 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q86tk\" (UniqueName: \"kubernetes.io/projected/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-kube-api-access-q86tk\") pod \"certified-operators-7565l\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.237063 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-catalog-content\") pod \"certified-operators-7565l\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.237116 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-utilities\") pod \"certified-operators-7565l\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.237555 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-utilities\") pod \"certified-operators-7565l\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.239814 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-catalog-content\") pod \"certified-operators-7565l\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " 
pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.257424 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q86tk\" (UniqueName: \"kubernetes.io/projected/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-kube-api-access-q86tk\") pod \"certified-operators-7565l\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.346161 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.745549 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98wth"] Feb 20 18:05:45 crc kubenswrapper[4697]: I0220 18:05:45.977200 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7565l"] Feb 20 18:05:46 crc kubenswrapper[4697]: I0220 18:05:46.298628 4697 generic.go:334] "Generic (PLEG): container finished" podID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" containerID="f7228017eaaa03e7f81f3dc5fccd9f7a5bffd833137eb891f0547056caecbe86" exitCode=0 Feb 20 18:05:46 crc kubenswrapper[4697]: I0220 18:05:46.298700 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7565l" event={"ID":"f2df1468-57f5-474e-8922-5a1b7b0ab3f2","Type":"ContainerDied","Data":"f7228017eaaa03e7f81f3dc5fccd9f7a5bffd833137eb891f0547056caecbe86"} Feb 20 18:05:46 crc kubenswrapper[4697]: I0220 18:05:46.298727 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7565l" event={"ID":"f2df1468-57f5-474e-8922-5a1b7b0ab3f2","Type":"ContainerStarted","Data":"f4d9fcdc73e913814e31c8f27f5b88a48d02bf70fd4e15b2c552735facb2804c"} Feb 20 18:05:46 crc kubenswrapper[4697]: I0220 18:05:46.300927 4697 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Feb 20 18:05:46 crc kubenswrapper[4697]: I0220 18:05:46.301095 4697 generic.go:334] "Generic (PLEG): container finished" podID="0fc55963-9326-459a-9594-7f06fbd6c697" containerID="e9ad9622a38e775dc0b466e7d15fa2f00a4ddc676fffe4a3d07d12bc092f1da7" exitCode=0 Feb 20 18:05:46 crc kubenswrapper[4697]: I0220 18:05:46.301149 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98wth" event={"ID":"0fc55963-9326-459a-9594-7f06fbd6c697","Type":"ContainerDied","Data":"e9ad9622a38e775dc0b466e7d15fa2f00a4ddc676fffe4a3d07d12bc092f1da7"} Feb 20 18:05:46 crc kubenswrapper[4697]: I0220 18:05:46.301182 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98wth" event={"ID":"0fc55963-9326-459a-9594-7f06fbd6c697","Type":"ContainerStarted","Data":"663b4cc75778d34cf5b9fef24e4848af227ee5a9e7ffeea995ac6bf67d682ed6"} Feb 20 18:05:48 crc kubenswrapper[4697]: I0220 18:05:48.325900 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7565l" event={"ID":"f2df1468-57f5-474e-8922-5a1b7b0ab3f2","Type":"ContainerStarted","Data":"60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c"} Feb 20 18:05:48 crc kubenswrapper[4697]: I0220 18:05:48.328625 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98wth" event={"ID":"0fc55963-9326-459a-9594-7f06fbd6c697","Type":"ContainerStarted","Data":"0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c"} Feb 20 18:05:49 crc kubenswrapper[4697]: I0220 18:05:49.339638 4697 generic.go:334] "Generic (PLEG): container finished" podID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" containerID="60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c" exitCode=0 Feb 20 18:05:49 crc kubenswrapper[4697]: I0220 18:05:49.339713 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-7565l" event={"ID":"f2df1468-57f5-474e-8922-5a1b7b0ab3f2","Type":"ContainerDied","Data":"60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c"} Feb 20 18:05:50 crc kubenswrapper[4697]: I0220 18:05:50.351578 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7565l" event={"ID":"f2df1468-57f5-474e-8922-5a1b7b0ab3f2","Type":"ContainerStarted","Data":"99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6"} Feb 20 18:05:50 crc kubenswrapper[4697]: I0220 18:05:50.374150 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7565l" podStartSLOduration=1.7450878680000002 podStartE2EDuration="5.37413285s" podCreationTimestamp="2026-02-20 18:05:45 +0000 UTC" firstStartedPulling="2026-02-20 18:05:46.300728028 +0000 UTC m=+5654.080773436" lastFinishedPulling="2026-02-20 18:05:49.929773 +0000 UTC m=+5657.709818418" observedRunningTime="2026-02-20 18:05:50.366820422 +0000 UTC m=+5658.146865870" watchObservedRunningTime="2026-02-20 18:05:50.37413285 +0000 UTC m=+5658.154178258" Feb 20 18:05:53 crc kubenswrapper[4697]: I0220 18:05:53.471926 4697 generic.go:334] "Generic (PLEG): container finished" podID="0fc55963-9326-459a-9594-7f06fbd6c697" containerID="0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c" exitCode=0 Feb 20 18:05:53 crc kubenswrapper[4697]: I0220 18:05:53.472222 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98wth" event={"ID":"0fc55963-9326-459a-9594-7f06fbd6c697","Type":"ContainerDied","Data":"0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c"} Feb 20 18:05:54 crc kubenswrapper[4697]: I0220 18:05:54.482091 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98wth" 
event={"ID":"0fc55963-9326-459a-9594-7f06fbd6c697","Type":"ContainerStarted","Data":"3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883"} Feb 20 18:05:54 crc kubenswrapper[4697]: I0220 18:05:54.509577 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-98wth" podStartSLOduration=2.735537425 podStartE2EDuration="10.509555704s" podCreationTimestamp="2026-02-20 18:05:44 +0000 UTC" firstStartedPulling="2026-02-20 18:05:46.302764678 +0000 UTC m=+5654.082810086" lastFinishedPulling="2026-02-20 18:05:54.076782957 +0000 UTC m=+5661.856828365" observedRunningTime="2026-02-20 18:05:54.500491123 +0000 UTC m=+5662.280536531" watchObservedRunningTime="2026-02-20 18:05:54.509555704 +0000 UTC m=+5662.289601112" Feb 20 18:05:55 crc kubenswrapper[4697]: I0220 18:05:55.205897 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:55 crc kubenswrapper[4697]: I0220 18:05:55.205963 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:05:55 crc kubenswrapper[4697]: I0220 18:05:55.348362 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:55 crc kubenswrapper[4697]: I0220 18:05:55.349274 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:55 crc kubenswrapper[4697]: I0220 18:05:55.440196 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:55 crc kubenswrapper[4697]: I0220 18:05:55.566250 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:56 crc kubenswrapper[4697]: I0220 18:05:56.218837 4697 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7565l"] Feb 20 18:05:56 crc kubenswrapper[4697]: I0220 18:05:56.255013 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-98wth" podUID="0fc55963-9326-459a-9594-7f06fbd6c697" containerName="registry-server" probeResult="failure" output=< Feb 20 18:05:56 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 18:05:56 crc kubenswrapper[4697]: > Feb 20 18:05:57 crc kubenswrapper[4697]: I0220 18:05:57.512056 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7565l" podUID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" containerName="registry-server" containerID="cri-o://99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6" gracePeriod=2 Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.075578 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.207804 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-utilities\") pod \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.207900 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q86tk\" (UniqueName: \"kubernetes.io/projected/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-kube-api-access-q86tk\") pod \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.208033 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-catalog-content\") pod \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\" (UID: \"f2df1468-57f5-474e-8922-5a1b7b0ab3f2\") " Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.209050 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-utilities" (OuterVolumeSpecName: "utilities") pod "f2df1468-57f5-474e-8922-5a1b7b0ab3f2" (UID: "f2df1468-57f5-474e-8922-5a1b7b0ab3f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.246704 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-kube-api-access-q86tk" (OuterVolumeSpecName: "kube-api-access-q86tk") pod "f2df1468-57f5-474e-8922-5a1b7b0ab3f2" (UID: "f2df1468-57f5-474e-8922-5a1b7b0ab3f2"). InnerVolumeSpecName "kube-api-access-q86tk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.284007 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2df1468-57f5-474e-8922-5a1b7b0ab3f2" (UID: "f2df1468-57f5-474e-8922-5a1b7b0ab3f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.310775 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.310807 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q86tk\" (UniqueName: \"kubernetes.io/projected/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-kube-api-access-q86tk\") on node \"crc\" DevicePath \"\"" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.310818 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2df1468-57f5-474e-8922-5a1b7b0ab3f2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.522162 4697 generic.go:334] "Generic (PLEG): container finished" podID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" containerID="99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6" exitCode=0 Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.522204 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7565l" event={"ID":"f2df1468-57f5-474e-8922-5a1b7b0ab3f2","Type":"ContainerDied","Data":"99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6"} Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.522229 4697 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-7565l" event={"ID":"f2df1468-57f5-474e-8922-5a1b7b0ab3f2","Type":"ContainerDied","Data":"f4d9fcdc73e913814e31c8f27f5b88a48d02bf70fd4e15b2c552735facb2804c"} Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.522245 4697 scope.go:117] "RemoveContainer" containerID="99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.522371 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7565l" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.557148 4697 scope.go:117] "RemoveContainer" containerID="60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.558464 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7565l"] Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.575150 4697 scope.go:117] "RemoveContainer" containerID="f7228017eaaa03e7f81f3dc5fccd9f7a5bffd833137eb891f0547056caecbe86" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.585011 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7565l"] Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.636372 4697 scope.go:117] "RemoveContainer" containerID="99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6" Feb 20 18:05:58 crc kubenswrapper[4697]: E0220 18:05:58.636851 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6\": container with ID starting with 99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6 not found: ID does not exist" containerID="99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 
18:05:58.636885 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6"} err="failed to get container status \"99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6\": rpc error: code = NotFound desc = could not find container \"99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6\": container with ID starting with 99805f1d101f93b1cbd455ea91dfad310ee22d8e12b4fbba865c73370da00ad6 not found: ID does not exist" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.636908 4697 scope.go:117] "RemoveContainer" containerID="60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c" Feb 20 18:05:58 crc kubenswrapper[4697]: E0220 18:05:58.637825 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c\": container with ID starting with 60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c not found: ID does not exist" containerID="60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.637861 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c"} err="failed to get container status \"60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c\": rpc error: code = NotFound desc = could not find container \"60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c\": container with ID starting with 60430c25d7dec07b9f30f1ba422607f1055989ffe843054d7ac799415e8b6a6c not found: ID does not exist" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.637880 4697 scope.go:117] "RemoveContainer" containerID="f7228017eaaa03e7f81f3dc5fccd9f7a5bffd833137eb891f0547056caecbe86" Feb 20 18:05:58 crc 
kubenswrapper[4697]: E0220 18:05:58.638261 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7228017eaaa03e7f81f3dc5fccd9f7a5bffd833137eb891f0547056caecbe86\": container with ID starting with f7228017eaaa03e7f81f3dc5fccd9f7a5bffd833137eb891f0547056caecbe86 not found: ID does not exist" containerID="f7228017eaaa03e7f81f3dc5fccd9f7a5bffd833137eb891f0547056caecbe86" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.638296 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7228017eaaa03e7f81f3dc5fccd9f7a5bffd833137eb891f0547056caecbe86"} err="failed to get container status \"f7228017eaaa03e7f81f3dc5fccd9f7a5bffd833137eb891f0547056caecbe86\": rpc error: code = NotFound desc = could not find container \"f7228017eaaa03e7f81f3dc5fccd9f7a5bffd833137eb891f0547056caecbe86\": container with ID starting with f7228017eaaa03e7f81f3dc5fccd9f7a5bffd833137eb891f0547056caecbe86 not found: ID does not exist" Feb 20 18:05:58 crc kubenswrapper[4697]: I0220 18:05:58.893262 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" path="/var/lib/kubelet/pods/f2df1468-57f5-474e-8922-5a1b7b0ab3f2/volumes" Feb 20 18:06:01 crc kubenswrapper[4697]: I0220 18:06:01.184950 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 18:06:01 crc kubenswrapper[4697]: I0220 18:06:01.187086 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 20 18:06:01 crc kubenswrapper[4697]: I0220 18:06:01.187309 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 18:06:01 crc kubenswrapper[4697]: I0220 18:06:01.188761 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 18:06:01 crc kubenswrapper[4697]: I0220 18:06:01.189101 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" gracePeriod=600 Feb 20 18:06:01 crc kubenswrapper[4697]: E0220 18:06:01.315292 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:06:01 crc kubenswrapper[4697]: I0220 18:06:01.570830 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" exitCode=0 Feb 20 18:06:01 crc kubenswrapper[4697]: I0220 18:06:01.570892 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6"} Feb 20 18:06:01 crc kubenswrapper[4697]: I0220 18:06:01.570940 4697 scope.go:117] "RemoveContainer" containerID="f70c8b9f6708d91f911ce6e63dfc4940e276016d2a4151ff12953e15e5419aa9" Feb 20 18:06:01 crc kubenswrapper[4697]: I0220 18:06:01.572308 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:06:01 crc kubenswrapper[4697]: E0220 18:06:01.572853 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:06:05 crc kubenswrapper[4697]: I0220 18:06:05.283861 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:06:05 crc kubenswrapper[4697]: I0220 18:06:05.383687 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:06:05 crc kubenswrapper[4697]: I0220 18:06:05.534061 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98wth"] Feb 20 18:06:05 crc kubenswrapper[4697]: E0220 18:06:05.943995 4697 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Feb 20 18:06:06 crc kubenswrapper[4697]: I0220 18:06:06.618057 4697 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/redhat-operators-98wth" podUID="0fc55963-9326-459a-9594-7f06fbd6c697" containerName="registry-server" containerID="cri-o://3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883" gracePeriod=2 Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.148787 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.212299 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-utilities\") pod \"0fc55963-9326-459a-9594-7f06fbd6c697\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.212513 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s424\" (UniqueName: \"kubernetes.io/projected/0fc55963-9326-459a-9594-7f06fbd6c697-kube-api-access-8s424\") pod \"0fc55963-9326-459a-9594-7f06fbd6c697\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.212600 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-catalog-content\") pod \"0fc55963-9326-459a-9594-7f06fbd6c697\" (UID: \"0fc55963-9326-459a-9594-7f06fbd6c697\") " Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.213756 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-utilities" (OuterVolumeSpecName: "utilities") pod "0fc55963-9326-459a-9594-7f06fbd6c697" (UID: "0fc55963-9326-459a-9594-7f06fbd6c697"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.219459 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fc55963-9326-459a-9594-7f06fbd6c697-kube-api-access-8s424" (OuterVolumeSpecName: "kube-api-access-8s424") pod "0fc55963-9326-459a-9594-7f06fbd6c697" (UID: "0fc55963-9326-459a-9594-7f06fbd6c697"). InnerVolumeSpecName "kube-api-access-8s424". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.314998 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.315035 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s424\" (UniqueName: \"kubernetes.io/projected/0fc55963-9326-459a-9594-7f06fbd6c697-kube-api-access-8s424\") on node \"crc\" DevicePath \"\"" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.346419 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fc55963-9326-459a-9594-7f06fbd6c697" (UID: "0fc55963-9326-459a-9594-7f06fbd6c697"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.417799 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc55963-9326-459a-9594-7f06fbd6c697-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.629789 4697 generic.go:334] "Generic (PLEG): container finished" podID="0fc55963-9326-459a-9594-7f06fbd6c697" containerID="3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883" exitCode=0 Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.629828 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98wth" event={"ID":"0fc55963-9326-459a-9594-7f06fbd6c697","Type":"ContainerDied","Data":"3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883"} Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.629840 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-98wth" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.630692 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98wth" event={"ID":"0fc55963-9326-459a-9594-7f06fbd6c697","Type":"ContainerDied","Data":"663b4cc75778d34cf5b9fef24e4848af227ee5a9e7ffeea995ac6bf67d682ed6"} Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.630825 4697 scope.go:117] "RemoveContainer" containerID="3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.666381 4697 scope.go:117] "RemoveContainer" containerID="0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.682693 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98wth"] Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.698837 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-98wth"] Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.700523 4697 scope.go:117] "RemoveContainer" containerID="e9ad9622a38e775dc0b466e7d15fa2f00a4ddc676fffe4a3d07d12bc092f1da7" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.771502 4697 scope.go:117] "RemoveContainer" containerID="3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883" Feb 20 18:06:07 crc kubenswrapper[4697]: E0220 18:06:07.771995 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883\": container with ID starting with 3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883 not found: ID does not exist" containerID="3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.772028 4697 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883"} err="failed to get container status \"3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883\": rpc error: code = NotFound desc = could not find container \"3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883\": container with ID starting with 3d107963cc0753458792e3ea850274c3317ccdce5dd0adb3cba628a0f12a5883 not found: ID does not exist" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.772047 4697 scope.go:117] "RemoveContainer" containerID="0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c" Feb 20 18:06:07 crc kubenswrapper[4697]: E0220 18:06:07.781719 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c\": container with ID starting with 0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c not found: ID does not exist" containerID="0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.781751 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c"} err="failed to get container status \"0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c\": rpc error: code = NotFound desc = could not find container \"0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c\": container with ID starting with 0f8b2fb6d34d8b6e5069a56431e7318eedf0aae71644fbbaed616ea7ab7e166c not found: ID does not exist" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.781770 4697 scope.go:117] "RemoveContainer" containerID="e9ad9622a38e775dc0b466e7d15fa2f00a4ddc676fffe4a3d07d12bc092f1da7" Feb 20 18:06:07 crc kubenswrapper[4697]: E0220 
18:06:07.782525 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9ad9622a38e775dc0b466e7d15fa2f00a4ddc676fffe4a3d07d12bc092f1da7\": container with ID starting with e9ad9622a38e775dc0b466e7d15fa2f00a4ddc676fffe4a3d07d12bc092f1da7 not found: ID does not exist" containerID="e9ad9622a38e775dc0b466e7d15fa2f00a4ddc676fffe4a3d07d12bc092f1da7" Feb 20 18:06:07 crc kubenswrapper[4697]: I0220 18:06:07.782605 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9ad9622a38e775dc0b466e7d15fa2f00a4ddc676fffe4a3d07d12bc092f1da7"} err="failed to get container status \"e9ad9622a38e775dc0b466e7d15fa2f00a4ddc676fffe4a3d07d12bc092f1da7\": rpc error: code = NotFound desc = could not find container \"e9ad9622a38e775dc0b466e7d15fa2f00a4ddc676fffe4a3d07d12bc092f1da7\": container with ID starting with e9ad9622a38e775dc0b466e7d15fa2f00a4ddc676fffe4a3d07d12bc092f1da7 not found: ID does not exist" Feb 20 18:06:08 crc kubenswrapper[4697]: I0220 18:06:08.898214 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fc55963-9326-459a-9594-7f06fbd6c697" path="/var/lib/kubelet/pods/0fc55963-9326-459a-9594-7f06fbd6c697/volumes" Feb 20 18:06:14 crc kubenswrapper[4697]: I0220 18:06:14.884975 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:06:14 crc kubenswrapper[4697]: E0220 18:06:14.885981 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:06:26 crc kubenswrapper[4697]: I0220 18:06:26.877230 
4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:06:26 crc kubenswrapper[4697]: E0220 18:06:26.877942 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:06:37 crc kubenswrapper[4697]: I0220 18:06:37.430497 4697 generic.go:334] "Generic (PLEG): container finished" podID="35308998-f1d9-472f-b649-2a7623cf6987" containerID="19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5" exitCode=0 Feb 20 18:06:37 crc kubenswrapper[4697]: I0220 18:06:37.430619 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w8dxn/must-gather-xb7hq" event={"ID":"35308998-f1d9-472f-b649-2a7623cf6987","Type":"ContainerDied","Data":"19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5"} Feb 20 18:06:37 crc kubenswrapper[4697]: I0220 18:06:37.431796 4697 scope.go:117] "RemoveContainer" containerID="19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5" Feb 20 18:06:38 crc kubenswrapper[4697]: I0220 18:06:38.387371 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w8dxn_must-gather-xb7hq_35308998-f1d9-472f-b649-2a7623cf6987/gather/0.log" Feb 20 18:06:40 crc kubenswrapper[4697]: I0220 18:06:40.876825 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:06:40 crc kubenswrapper[4697]: E0220 18:06:40.877258 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:06:46 crc kubenswrapper[4697]: I0220 18:06:46.952504 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w8dxn/must-gather-xb7hq"] Feb 20 18:06:46 crc kubenswrapper[4697]: I0220 18:06:46.953338 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-w8dxn/must-gather-xb7hq" podUID="35308998-f1d9-472f-b649-2a7623cf6987" containerName="copy" containerID="cri-o://cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4" gracePeriod=2 Feb 20 18:06:46 crc kubenswrapper[4697]: I0220 18:06:46.973204 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w8dxn/must-gather-xb7hq"] Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.407636 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w8dxn_must-gather-xb7hq_35308998-f1d9-472f-b649-2a7623cf6987/copy/0.log" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.408145 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w8dxn/must-gather-xb7hq" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.477686 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/35308998-f1d9-472f-b649-2a7623cf6987-must-gather-output\") pod \"35308998-f1d9-472f-b649-2a7623cf6987\" (UID: \"35308998-f1d9-472f-b649-2a7623cf6987\") " Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.477929 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlpqs\" (UniqueName: \"kubernetes.io/projected/35308998-f1d9-472f-b649-2a7623cf6987-kube-api-access-zlpqs\") pod \"35308998-f1d9-472f-b649-2a7623cf6987\" (UID: \"35308998-f1d9-472f-b649-2a7623cf6987\") " Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.483662 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35308998-f1d9-472f-b649-2a7623cf6987-kube-api-access-zlpqs" (OuterVolumeSpecName: "kube-api-access-zlpqs") pod "35308998-f1d9-472f-b649-2a7623cf6987" (UID: "35308998-f1d9-472f-b649-2a7623cf6987"). InnerVolumeSpecName "kube-api-access-zlpqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.523161 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w8dxn_must-gather-xb7hq_35308998-f1d9-472f-b649-2a7623cf6987/copy/0.log" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.523815 4697 generic.go:334] "Generic (PLEG): container finished" podID="35308998-f1d9-472f-b649-2a7623cf6987" containerID="cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4" exitCode=143 Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.523871 4697 scope.go:117] "RemoveContainer" containerID="cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.523993 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w8dxn/must-gather-xb7hq" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.561852 4697 scope.go:117] "RemoveContainer" containerID="19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.580931 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlpqs\" (UniqueName: \"kubernetes.io/projected/35308998-f1d9-472f-b649-2a7623cf6987-kube-api-access-zlpqs\") on node \"crc\" DevicePath \"\"" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.639410 4697 scope.go:117] "RemoveContainer" containerID="cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4" Feb 20 18:06:47 crc kubenswrapper[4697]: E0220 18:06:47.640018 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4\": container with ID starting with cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4 not found: ID does not exist" 
containerID="cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.640085 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4"} err="failed to get container status \"cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4\": rpc error: code = NotFound desc = could not find container \"cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4\": container with ID starting with cad8c9e025f135fd9863aea41a45b054db6592416854aa82c98cb30f3cbb2bd4 not found: ID does not exist" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.640117 4697 scope.go:117] "RemoveContainer" containerID="19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5" Feb 20 18:06:47 crc kubenswrapper[4697]: E0220 18:06:47.640707 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5\": container with ID starting with 19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5 not found: ID does not exist" containerID="19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.640738 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5"} err="failed to get container status \"19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5\": rpc error: code = NotFound desc = could not find container \"19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5\": container with ID starting with 19a4faaa692eac28f2ed1f251b5baf433094ae224cc69a1829515856a997bfb5 not found: ID does not exist" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.687054 4697 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35308998-f1d9-472f-b649-2a7623cf6987-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "35308998-f1d9-472f-b649-2a7623cf6987" (UID: "35308998-f1d9-472f-b649-2a7623cf6987"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:06:47 crc kubenswrapper[4697]: I0220 18:06:47.788136 4697 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/35308998-f1d9-472f-b649-2a7623cf6987-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 20 18:06:48 crc kubenswrapper[4697]: I0220 18:06:48.887116 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35308998-f1d9-472f-b649-2a7623cf6987" path="/var/lib/kubelet/pods/35308998-f1d9-472f-b649-2a7623cf6987/volumes" Feb 20 18:06:52 crc kubenswrapper[4697]: I0220 18:06:52.886890 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:06:52 crc kubenswrapper[4697]: E0220 18:06:52.888375 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:07:06 crc kubenswrapper[4697]: I0220 18:07:06.877800 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:07:06 crc kubenswrapper[4697]: E0220 18:07:06.879269 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:07:17 crc kubenswrapper[4697]: I0220 18:07:17.878146 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:07:17 crc kubenswrapper[4697]: E0220 18:07:17.879002 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:07:30 crc kubenswrapper[4697]: I0220 18:07:30.877797 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:07:30 crc kubenswrapper[4697]: E0220 18:07:30.879040 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:07:43 crc kubenswrapper[4697]: I0220 18:07:43.878196 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:07:43 crc kubenswrapper[4697]: E0220 18:07:43.878949 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:07:58 crc kubenswrapper[4697]: I0220 18:07:58.877784 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:07:58 crc kubenswrapper[4697]: E0220 18:07:58.878753 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:08:12 crc kubenswrapper[4697]: I0220 18:08:12.883265 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:08:12 crc kubenswrapper[4697]: E0220 18:08:12.884178 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:08:23 crc kubenswrapper[4697]: I0220 18:08:23.876793 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:08:23 crc kubenswrapper[4697]: E0220 18:08:23.877989 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.077895 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7c429"] Feb 20 18:08:28 crc kubenswrapper[4697]: E0220 18:08:28.078841 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" containerName="extract-content" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.078853 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" containerName="extract-content" Feb 20 18:08:28 crc kubenswrapper[4697]: E0220 18:08:28.078868 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" containerName="registry-server" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.078874 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" containerName="registry-server" Feb 20 18:08:28 crc kubenswrapper[4697]: E0220 18:08:28.078888 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" containerName="extract-utilities" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.078897 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" containerName="extract-utilities" Feb 20 18:08:28 crc kubenswrapper[4697]: E0220 18:08:28.078907 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc55963-9326-459a-9594-7f06fbd6c697" containerName="extract-utilities" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.078912 4697 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0fc55963-9326-459a-9594-7f06fbd6c697" containerName="extract-utilities" Feb 20 18:08:28 crc kubenswrapper[4697]: E0220 18:08:28.078925 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35308998-f1d9-472f-b649-2a7623cf6987" containerName="gather" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.078931 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="35308998-f1d9-472f-b649-2a7623cf6987" containerName="gather" Feb 20 18:08:28 crc kubenswrapper[4697]: E0220 18:08:28.078945 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc55963-9326-459a-9594-7f06fbd6c697" containerName="registry-server" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.078950 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc55963-9326-459a-9594-7f06fbd6c697" containerName="registry-server" Feb 20 18:08:28 crc kubenswrapper[4697]: E0220 18:08:28.078964 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35308998-f1d9-472f-b649-2a7623cf6987" containerName="copy" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.078970 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="35308998-f1d9-472f-b649-2a7623cf6987" containerName="copy" Feb 20 18:08:28 crc kubenswrapper[4697]: E0220 18:08:28.078994 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc55963-9326-459a-9594-7f06fbd6c697" containerName="extract-content" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.079000 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc55963-9326-459a-9594-7f06fbd6c697" containerName="extract-content" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.079208 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc55963-9326-459a-9594-7f06fbd6c697" containerName="registry-server" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.079229 4697 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="35308998-f1d9-472f-b649-2a7623cf6987" containerName="copy" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.079243 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2df1468-57f5-474e-8922-5a1b7b0ab3f2" containerName="registry-server" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.079255 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="35308998-f1d9-472f-b649-2a7623cf6987" containerName="gather" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.080772 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.091476 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7c429"] Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.252158 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-utilities\") pod \"redhat-marketplace-7c429\" (UID: \"62a9220c-6e88-4ad3-a683-c6024165c63d\") " pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.252263 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-catalog-content\") pod \"redhat-marketplace-7c429\" (UID: \"62a9220c-6e88-4ad3-a683-c6024165c63d\") " pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.252397 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts8hx\" (UniqueName: \"kubernetes.io/projected/62a9220c-6e88-4ad3-a683-c6024165c63d-kube-api-access-ts8hx\") pod \"redhat-marketplace-7c429\" (UID: 
\"62a9220c-6e88-4ad3-a683-c6024165c63d\") " pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.354629 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts8hx\" (UniqueName: \"kubernetes.io/projected/62a9220c-6e88-4ad3-a683-c6024165c63d-kube-api-access-ts8hx\") pod \"redhat-marketplace-7c429\" (UID: \"62a9220c-6e88-4ad3-a683-c6024165c63d\") " pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.354740 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-utilities\") pod \"redhat-marketplace-7c429\" (UID: \"62a9220c-6e88-4ad3-a683-c6024165c63d\") " pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.354798 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-catalog-content\") pod \"redhat-marketplace-7c429\" (UID: \"62a9220c-6e88-4ad3-a683-c6024165c63d\") " pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.355320 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-utilities\") pod \"redhat-marketplace-7c429\" (UID: \"62a9220c-6e88-4ad3-a683-c6024165c63d\") " pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.355385 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-catalog-content\") pod \"redhat-marketplace-7c429\" (UID: \"62a9220c-6e88-4ad3-a683-c6024165c63d\") " 
pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.377089 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts8hx\" (UniqueName: \"kubernetes.io/projected/62a9220c-6e88-4ad3-a683-c6024165c63d-kube-api-access-ts8hx\") pod \"redhat-marketplace-7c429\" (UID: \"62a9220c-6e88-4ad3-a683-c6024165c63d\") " pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.409372 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:28 crc kubenswrapper[4697]: I0220 18:08:28.916707 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7c429"] Feb 20 18:08:29 crc kubenswrapper[4697]: I0220 18:08:29.557103 4697 generic.go:334] "Generic (PLEG): container finished" podID="62a9220c-6e88-4ad3-a683-c6024165c63d" containerID="896cc48187b7dc286e94d859834c51ef6f2aab3bd6b37e964c356899cbb58101" exitCode=0 Feb 20 18:08:29 crc kubenswrapper[4697]: I0220 18:08:29.557412 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7c429" event={"ID":"62a9220c-6e88-4ad3-a683-c6024165c63d","Type":"ContainerDied","Data":"896cc48187b7dc286e94d859834c51ef6f2aab3bd6b37e964c356899cbb58101"} Feb 20 18:08:29 crc kubenswrapper[4697]: I0220 18:08:29.557461 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7c429" event={"ID":"62a9220c-6e88-4ad3-a683-c6024165c63d","Type":"ContainerStarted","Data":"a234e81175ec54b9540b9171aecc28031dbfac67424729209bf04937006276f0"} Feb 20 18:08:30 crc kubenswrapper[4697]: I0220 18:08:30.568652 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7c429" 
event={"ID":"62a9220c-6e88-4ad3-a683-c6024165c63d","Type":"ContainerStarted","Data":"a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3"} Feb 20 18:08:31 crc kubenswrapper[4697]: I0220 18:08:31.577740 4697 generic.go:334] "Generic (PLEG): container finished" podID="62a9220c-6e88-4ad3-a683-c6024165c63d" containerID="a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3" exitCode=0 Feb 20 18:08:31 crc kubenswrapper[4697]: I0220 18:08:31.577783 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7c429" event={"ID":"62a9220c-6e88-4ad3-a683-c6024165c63d","Type":"ContainerDied","Data":"a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3"} Feb 20 18:08:32 crc kubenswrapper[4697]: I0220 18:08:32.589074 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7c429" event={"ID":"62a9220c-6e88-4ad3-a683-c6024165c63d","Type":"ContainerStarted","Data":"ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b"} Feb 20 18:08:32 crc kubenswrapper[4697]: I0220 18:08:32.622385 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7c429" podStartSLOduration=2.195084067 podStartE2EDuration="4.622368108s" podCreationTimestamp="2026-02-20 18:08:28 +0000 UTC" firstStartedPulling="2026-02-20 18:08:29.559126804 +0000 UTC m=+5817.339172222" lastFinishedPulling="2026-02-20 18:08:31.986410845 +0000 UTC m=+5819.766456263" observedRunningTime="2026-02-20 18:08:32.615596221 +0000 UTC m=+5820.395641629" watchObservedRunningTime="2026-02-20 18:08:32.622368108 +0000 UTC m=+5820.402413516" Feb 20 18:08:34 crc kubenswrapper[4697]: I0220 18:08:34.877752 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:08:34 crc kubenswrapper[4697]: E0220 18:08:34.878324 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:08:38 crc kubenswrapper[4697]: I0220 18:08:38.409856 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:38 crc kubenswrapper[4697]: I0220 18:08:38.410380 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:38 crc kubenswrapper[4697]: I0220 18:08:38.474965 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:38 crc kubenswrapper[4697]: I0220 18:08:38.687249 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:38 crc kubenswrapper[4697]: I0220 18:08:38.736058 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7c429"] Feb 20 18:08:40 crc kubenswrapper[4697]: I0220 18:08:40.655272 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7c429" podUID="62a9220c-6e88-4ad3-a683-c6024165c63d" containerName="registry-server" containerID="cri-o://ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b" gracePeriod=2 Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.200861 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.333144 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-catalog-content\") pod \"62a9220c-6e88-4ad3-a683-c6024165c63d\" (UID: \"62a9220c-6e88-4ad3-a683-c6024165c63d\") " Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.333417 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts8hx\" (UniqueName: \"kubernetes.io/projected/62a9220c-6e88-4ad3-a683-c6024165c63d-kube-api-access-ts8hx\") pod \"62a9220c-6e88-4ad3-a683-c6024165c63d\" (UID: \"62a9220c-6e88-4ad3-a683-c6024165c63d\") " Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.333518 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-utilities\") pod \"62a9220c-6e88-4ad3-a683-c6024165c63d\" (UID: \"62a9220c-6e88-4ad3-a683-c6024165c63d\") " Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.334655 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-utilities" (OuterVolumeSpecName: "utilities") pod "62a9220c-6e88-4ad3-a683-c6024165c63d" (UID: "62a9220c-6e88-4ad3-a683-c6024165c63d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.341938 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a9220c-6e88-4ad3-a683-c6024165c63d-kube-api-access-ts8hx" (OuterVolumeSpecName: "kube-api-access-ts8hx") pod "62a9220c-6e88-4ad3-a683-c6024165c63d" (UID: "62a9220c-6e88-4ad3-a683-c6024165c63d"). InnerVolumeSpecName "kube-api-access-ts8hx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.371385 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62a9220c-6e88-4ad3-a683-c6024165c63d" (UID: "62a9220c-6e88-4ad3-a683-c6024165c63d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.435875 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.435917 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts8hx\" (UniqueName: \"kubernetes.io/projected/62a9220c-6e88-4ad3-a683-c6024165c63d-kube-api-access-ts8hx\") on node \"crc\" DevicePath \"\"" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.435933 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a9220c-6e88-4ad3-a683-c6024165c63d-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.668521 4697 generic.go:334] "Generic (PLEG): container finished" podID="62a9220c-6e88-4ad3-a683-c6024165c63d" containerID="ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b" exitCode=0 Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.668574 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7c429" event={"ID":"62a9220c-6e88-4ad3-a683-c6024165c63d","Type":"ContainerDied","Data":"ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b"} Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.668609 4697 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-7c429" event={"ID":"62a9220c-6e88-4ad3-a683-c6024165c63d","Type":"ContainerDied","Data":"a234e81175ec54b9540b9171aecc28031dbfac67424729209bf04937006276f0"} Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.668635 4697 scope.go:117] "RemoveContainer" containerID="ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.668640 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7c429" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.700286 4697 scope.go:117] "RemoveContainer" containerID="a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.723689 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7c429"] Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.731377 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7c429"] Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.742751 4697 scope.go:117] "RemoveContainer" containerID="896cc48187b7dc286e94d859834c51ef6f2aab3bd6b37e964c356899cbb58101" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.795549 4697 scope.go:117] "RemoveContainer" containerID="ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b" Feb 20 18:08:41 crc kubenswrapper[4697]: E0220 18:08:41.796251 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b\": container with ID starting with ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b not found: ID does not exist" containerID="ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.796292 4697 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b"} err="failed to get container status \"ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b\": rpc error: code = NotFound desc = could not find container \"ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b\": container with ID starting with ecdab8864186829207d30aac4596aa63960cabf3aeb8926a2e66b894a7ff5d8b not found: ID does not exist" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.796319 4697 scope.go:117] "RemoveContainer" containerID="a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3" Feb 20 18:08:41 crc kubenswrapper[4697]: E0220 18:08:41.796788 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3\": container with ID starting with a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3 not found: ID does not exist" containerID="a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.796817 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3"} err="failed to get container status \"a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3\": rpc error: code = NotFound desc = could not find container \"a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3\": container with ID starting with a9dba87e65b4c8469acee646aed22c8c9ccf04d047fbfa36e5b040791ba47da3 not found: ID does not exist" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.796835 4697 scope.go:117] "RemoveContainer" containerID="896cc48187b7dc286e94d859834c51ef6f2aab3bd6b37e964c356899cbb58101" Feb 20 18:08:41 crc kubenswrapper[4697]: E0220 
18:08:41.797187 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"896cc48187b7dc286e94d859834c51ef6f2aab3bd6b37e964c356899cbb58101\": container with ID starting with 896cc48187b7dc286e94d859834c51ef6f2aab3bd6b37e964c356899cbb58101 not found: ID does not exist" containerID="896cc48187b7dc286e94d859834c51ef6f2aab3bd6b37e964c356899cbb58101" Feb 20 18:08:41 crc kubenswrapper[4697]: I0220 18:08:41.797217 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896cc48187b7dc286e94d859834c51ef6f2aab3bd6b37e964c356899cbb58101"} err="failed to get container status \"896cc48187b7dc286e94d859834c51ef6f2aab3bd6b37e964c356899cbb58101\": rpc error: code = NotFound desc = could not find container \"896cc48187b7dc286e94d859834c51ef6f2aab3bd6b37e964c356899cbb58101\": container with ID starting with 896cc48187b7dc286e94d859834c51ef6f2aab3bd6b37e964c356899cbb58101 not found: ID does not exist" Feb 20 18:08:42 crc kubenswrapper[4697]: I0220 18:08:42.890248 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62a9220c-6e88-4ad3-a683-c6024165c63d" path="/var/lib/kubelet/pods/62a9220c-6e88-4ad3-a683-c6024165c63d/volumes" Feb 20 18:08:46 crc kubenswrapper[4697]: I0220 18:08:46.877287 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:08:46 crc kubenswrapper[4697]: E0220 18:08:46.877845 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:08:58 crc kubenswrapper[4697]: I0220 18:08:58.877051 
4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:08:58 crc kubenswrapper[4697]: E0220 18:08:58.878055 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:09:13 crc kubenswrapper[4697]: I0220 18:09:13.877953 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:09:13 crc kubenswrapper[4697]: E0220 18:09:13.879047 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:09:26 crc kubenswrapper[4697]: I0220 18:09:26.877177 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:09:26 crc kubenswrapper[4697]: E0220 18:09:26.877857 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:09:41 crc kubenswrapper[4697]: I0220 
18:09:41.877636 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:09:41 crc kubenswrapper[4697]: E0220 18:09:41.878635 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:09:55 crc kubenswrapper[4697]: I0220 18:09:55.877319 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:09:55 crc kubenswrapper[4697]: E0220 18:09:55.878140 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:10:07 crc kubenswrapper[4697]: I0220 18:10:07.877925 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:10:07 crc kubenswrapper[4697]: E0220 18:10:07.878706 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:10:15 crc 
kubenswrapper[4697]: I0220 18:10:15.645155 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nbm5z/must-gather-xnv6s"] Feb 20 18:10:15 crc kubenswrapper[4697]: E0220 18:10:15.646127 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a9220c-6e88-4ad3-a683-c6024165c63d" containerName="registry-server" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.646144 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a9220c-6e88-4ad3-a683-c6024165c63d" containerName="registry-server" Feb 20 18:10:15 crc kubenswrapper[4697]: E0220 18:10:15.646172 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a9220c-6e88-4ad3-a683-c6024165c63d" containerName="extract-utilities" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.646180 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a9220c-6e88-4ad3-a683-c6024165c63d" containerName="extract-utilities" Feb 20 18:10:15 crc kubenswrapper[4697]: E0220 18:10:15.646197 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a9220c-6e88-4ad3-a683-c6024165c63d" containerName="extract-content" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.646203 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a9220c-6e88-4ad3-a683-c6024165c63d" containerName="extract-content" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.646403 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a9220c-6e88-4ad3-a683-c6024165c63d" containerName="registry-server" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.647565 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nbm5z/must-gather-xnv6s" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.655093 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nbm5z"/"openshift-service-ca.crt" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.658480 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nbm5z"/"kube-root-ca.crt" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.665197 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nbm5z/must-gather-xnv6s"] Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.678785 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkgg6\" (UniqueName: \"kubernetes.io/projected/d852f86a-ca7e-49f4-b223-af8574601e18-kube-api-access-lkgg6\") pod \"must-gather-xnv6s\" (UID: \"d852f86a-ca7e-49f4-b223-af8574601e18\") " pod="openshift-must-gather-nbm5z/must-gather-xnv6s" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.678862 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d852f86a-ca7e-49f4-b223-af8574601e18-must-gather-output\") pod \"must-gather-xnv6s\" (UID: \"d852f86a-ca7e-49f4-b223-af8574601e18\") " pod="openshift-must-gather-nbm5z/must-gather-xnv6s" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.790684 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkgg6\" (UniqueName: \"kubernetes.io/projected/d852f86a-ca7e-49f4-b223-af8574601e18-kube-api-access-lkgg6\") pod \"must-gather-xnv6s\" (UID: \"d852f86a-ca7e-49f4-b223-af8574601e18\") " pod="openshift-must-gather-nbm5z/must-gather-xnv6s" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.790736 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d852f86a-ca7e-49f4-b223-af8574601e18-must-gather-output\") pod \"must-gather-xnv6s\" (UID: \"d852f86a-ca7e-49f4-b223-af8574601e18\") " pod="openshift-must-gather-nbm5z/must-gather-xnv6s" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.791255 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d852f86a-ca7e-49f4-b223-af8574601e18-must-gather-output\") pod \"must-gather-xnv6s\" (UID: \"d852f86a-ca7e-49f4-b223-af8574601e18\") " pod="openshift-must-gather-nbm5z/must-gather-xnv6s" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.824310 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkgg6\" (UniqueName: \"kubernetes.io/projected/d852f86a-ca7e-49f4-b223-af8574601e18-kube-api-access-lkgg6\") pod \"must-gather-xnv6s\" (UID: \"d852f86a-ca7e-49f4-b223-af8574601e18\") " pod="openshift-must-gather-nbm5z/must-gather-xnv6s" Feb 20 18:10:15 crc kubenswrapper[4697]: I0220 18:10:15.971073 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nbm5z/must-gather-xnv6s" Feb 20 18:10:16 crc kubenswrapper[4697]: I0220 18:10:16.723102 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nbm5z/must-gather-xnv6s"] Feb 20 18:10:17 crc kubenswrapper[4697]: I0220 18:10:17.097351 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/must-gather-xnv6s" event={"ID":"d852f86a-ca7e-49f4-b223-af8574601e18","Type":"ContainerStarted","Data":"3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20"} Feb 20 18:10:17 crc kubenswrapper[4697]: I0220 18:10:17.097759 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/must-gather-xnv6s" event={"ID":"d852f86a-ca7e-49f4-b223-af8574601e18","Type":"ContainerStarted","Data":"87f9747e3706596766261215a057968b723ffe79a85ca1c97e867da5a805b192"} Feb 20 18:10:18 crc kubenswrapper[4697]: I0220 18:10:18.107123 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/must-gather-xnv6s" event={"ID":"d852f86a-ca7e-49f4-b223-af8574601e18","Type":"ContainerStarted","Data":"3bac8c082aa836d42d710220d2eea3c4c71d1485df8d1a65d7aa4cdea6970f95"} Feb 20 18:10:18 crc kubenswrapper[4697]: I0220 18:10:18.122633 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nbm5z/must-gather-xnv6s" podStartSLOduration=3.122617573 podStartE2EDuration="3.122617573s" podCreationTimestamp="2026-02-20 18:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 18:10:18.122272504 +0000 UTC m=+5925.902317932" watchObservedRunningTime="2026-02-20 18:10:18.122617573 +0000 UTC m=+5925.902662981" Feb 20 18:10:18 crc kubenswrapper[4697]: I0220 18:10:18.878447 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:10:18 crc 
kubenswrapper[4697]: E0220 18:10:18.878824 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:10:20 crc kubenswrapper[4697]: I0220 18:10:20.996407 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nbm5z/crc-debug-q7xkk"] Feb 20 18:10:20 crc kubenswrapper[4697]: I0220 18:10:20.998868 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" Feb 20 18:10:21 crc kubenswrapper[4697]: I0220 18:10:21.001099 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nbm5z"/"default-dockercfg-krqct" Feb 20 18:10:21 crc kubenswrapper[4697]: I0220 18:10:21.135716 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-host\") pod \"crc-debug-q7xkk\" (UID: \"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4\") " pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" Feb 20 18:10:21 crc kubenswrapper[4697]: I0220 18:10:21.136152 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgb8m\" (UniqueName: \"kubernetes.io/projected/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-kube-api-access-xgb8m\") pod \"crc-debug-q7xkk\" (UID: \"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4\") " pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" Feb 20 18:10:21 crc kubenswrapper[4697]: I0220 18:10:21.237791 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgb8m\" (UniqueName: 
\"kubernetes.io/projected/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-kube-api-access-xgb8m\") pod \"crc-debug-q7xkk\" (UID: \"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4\") " pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" Feb 20 18:10:21 crc kubenswrapper[4697]: I0220 18:10:21.237926 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-host\") pod \"crc-debug-q7xkk\" (UID: \"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4\") " pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" Feb 20 18:10:21 crc kubenswrapper[4697]: I0220 18:10:21.238095 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-host\") pod \"crc-debug-q7xkk\" (UID: \"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4\") " pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" Feb 20 18:10:21 crc kubenswrapper[4697]: I0220 18:10:21.264164 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgb8m\" (UniqueName: \"kubernetes.io/projected/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-kube-api-access-xgb8m\") pod \"crc-debug-q7xkk\" (UID: \"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4\") " pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" Feb 20 18:10:21 crc kubenswrapper[4697]: I0220 18:10:21.317732 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" Feb 20 18:10:22 crc kubenswrapper[4697]: I0220 18:10:22.164184 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" event={"ID":"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4","Type":"ContainerStarted","Data":"7fd39bd85f983d77c9ba58e0b051a3139e2e2df6c804f739e2dec7da4ad79694"} Feb 20 18:10:22 crc kubenswrapper[4697]: I0220 18:10:22.164680 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" event={"ID":"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4","Type":"ContainerStarted","Data":"7b75e4d60542f4b14fa1df84ea75954e8c60c57e4f6348180494974e0b036391"} Feb 20 18:10:22 crc kubenswrapper[4697]: I0220 18:10:22.194575 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" podStartSLOduration=2.194487512 podStartE2EDuration="2.194487512s" podCreationTimestamp="2026-02-20 18:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 18:10:22.179855863 +0000 UTC m=+5929.959901271" watchObservedRunningTime="2026-02-20 18:10:22.194487512 +0000 UTC m=+5929.974532910" Feb 20 18:10:31 crc kubenswrapper[4697]: I0220 18:10:31.878809 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:10:31 crc kubenswrapper[4697]: E0220 18:10:31.879575 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:10:45 crc 
kubenswrapper[4697]: I0220 18:10:45.877063 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:10:45 crc kubenswrapper[4697]: E0220 18:10:45.877846 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:10:59 crc kubenswrapper[4697]: I0220 18:10:59.877022 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:10:59 crc kubenswrapper[4697]: E0220 18:10:59.877759 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:11:01 crc kubenswrapper[4697]: I0220 18:11:01.840396 4697 generic.go:334] "Generic (PLEG): container finished" podID="ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4" containerID="7fd39bd85f983d77c9ba58e0b051a3139e2e2df6c804f739e2dec7da4ad79694" exitCode=0 Feb 20 18:11:01 crc kubenswrapper[4697]: I0220 18:11:01.840473 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" event={"ID":"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4","Type":"ContainerDied","Data":"7fd39bd85f983d77c9ba58e0b051a3139e2e2df6c804f739e2dec7da4ad79694"} Feb 20 18:11:02 crc kubenswrapper[4697]: I0220 18:11:02.966674 4697 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" Feb 20 18:11:03 crc kubenswrapper[4697]: I0220 18:11:03.005158 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nbm5z/crc-debug-q7xkk"] Feb 20 18:11:03 crc kubenswrapper[4697]: I0220 18:11:03.014998 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nbm5z/crc-debug-q7xkk"] Feb 20 18:11:03 crc kubenswrapper[4697]: I0220 18:11:03.130098 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-host\") pod \"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4\" (UID: \"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4\") " Feb 20 18:11:03 crc kubenswrapper[4697]: I0220 18:11:03.130217 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgb8m\" (UniqueName: \"kubernetes.io/projected/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-kube-api-access-xgb8m\") pod \"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4\" (UID: \"ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4\") " Feb 20 18:11:03 crc kubenswrapper[4697]: I0220 18:11:03.130254 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-host" (OuterVolumeSpecName: "host") pod "ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4" (UID: "ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 18:11:03 crc kubenswrapper[4697]: I0220 18:11:03.130945 4697 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-host\") on node \"crc\" DevicePath \"\"" Feb 20 18:11:03 crc kubenswrapper[4697]: I0220 18:11:03.135573 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-kube-api-access-xgb8m" (OuterVolumeSpecName: "kube-api-access-xgb8m") pod "ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4" (UID: "ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4"). InnerVolumeSpecName "kube-api-access-xgb8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:11:03 crc kubenswrapper[4697]: I0220 18:11:03.232770 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgb8m\" (UniqueName: \"kubernetes.io/projected/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4-kube-api-access-xgb8m\") on node \"crc\" DevicePath \"\"" Feb 20 18:11:03 crc kubenswrapper[4697]: I0220 18:11:03.862392 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b75e4d60542f4b14fa1df84ea75954e8c60c57e4f6348180494974e0b036391" Feb 20 18:11:03 crc kubenswrapper[4697]: I0220 18:11:03.862599 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-q7xkk" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.244556 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nbm5z/crc-debug-2pb62"] Feb 20 18:11:04 crc kubenswrapper[4697]: E0220 18:11:04.244952 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4" containerName="container-00" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.244964 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4" containerName="container-00" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.245170 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4" containerName="container-00" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.245959 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-2pb62" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.248042 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nbm5z"/"default-dockercfg-krqct" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.352878 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fd82e66-968b-4df8-89ac-145803bfba18-host\") pod \"crc-debug-2pb62\" (UID: \"1fd82e66-968b-4df8-89ac-145803bfba18\") " pod="openshift-must-gather-nbm5z/crc-debug-2pb62" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.353273 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mncb\" (UniqueName: \"kubernetes.io/projected/1fd82e66-968b-4df8-89ac-145803bfba18-kube-api-access-9mncb\") pod \"crc-debug-2pb62\" (UID: \"1fd82e66-968b-4df8-89ac-145803bfba18\") " 
pod="openshift-must-gather-nbm5z/crc-debug-2pb62" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.455557 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fd82e66-968b-4df8-89ac-145803bfba18-host\") pod \"crc-debug-2pb62\" (UID: \"1fd82e66-968b-4df8-89ac-145803bfba18\") " pod="openshift-must-gather-nbm5z/crc-debug-2pb62" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.455708 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mncb\" (UniqueName: \"kubernetes.io/projected/1fd82e66-968b-4df8-89ac-145803bfba18-kube-api-access-9mncb\") pod \"crc-debug-2pb62\" (UID: \"1fd82e66-968b-4df8-89ac-145803bfba18\") " pod="openshift-must-gather-nbm5z/crc-debug-2pb62" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.455720 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fd82e66-968b-4df8-89ac-145803bfba18-host\") pod \"crc-debug-2pb62\" (UID: \"1fd82e66-968b-4df8-89ac-145803bfba18\") " pod="openshift-must-gather-nbm5z/crc-debug-2pb62" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.473006 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mncb\" (UniqueName: \"kubernetes.io/projected/1fd82e66-968b-4df8-89ac-145803bfba18-kube-api-access-9mncb\") pod \"crc-debug-2pb62\" (UID: \"1fd82e66-968b-4df8-89ac-145803bfba18\") " pod="openshift-must-gather-nbm5z/crc-debug-2pb62" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.562034 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-2pb62" Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.872013 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/crc-debug-2pb62" event={"ID":"1fd82e66-968b-4df8-89ac-145803bfba18","Type":"ContainerStarted","Data":"fa9392e4d66ff8d2fc87ed0cacf0b86410a85f6b021e5075f28b43a53b3ac839"} Feb 20 18:11:04 crc kubenswrapper[4697]: I0220 18:11:04.887822 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4" path="/var/lib/kubelet/pods/ad2a16d8-a4ac-47a5-9d94-c55c0f021ec4/volumes" Feb 20 18:11:05 crc kubenswrapper[4697]: I0220 18:11:05.883529 4697 generic.go:334] "Generic (PLEG): container finished" podID="1fd82e66-968b-4df8-89ac-145803bfba18" containerID="53f53372d0f635c721224c8980f289d54365190679d36095c59571e92d96f2ca" exitCode=0 Feb 20 18:11:05 crc kubenswrapper[4697]: I0220 18:11:05.884813 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/crc-debug-2pb62" event={"ID":"1fd82e66-968b-4df8-89ac-145803bfba18","Type":"ContainerDied","Data":"53f53372d0f635c721224c8980f289d54365190679d36095c59571e92d96f2ca"} Feb 20 18:11:06 crc kubenswrapper[4697]: I0220 18:11:06.990836 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-2pb62" Feb 20 18:11:07 crc kubenswrapper[4697]: I0220 18:11:07.100821 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fd82e66-968b-4df8-89ac-145803bfba18-host\") pod \"1fd82e66-968b-4df8-89ac-145803bfba18\" (UID: \"1fd82e66-968b-4df8-89ac-145803bfba18\") " Feb 20 18:11:07 crc kubenswrapper[4697]: I0220 18:11:07.101084 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mncb\" (UniqueName: \"kubernetes.io/projected/1fd82e66-968b-4df8-89ac-145803bfba18-kube-api-access-9mncb\") pod \"1fd82e66-968b-4df8-89ac-145803bfba18\" (UID: \"1fd82e66-968b-4df8-89ac-145803bfba18\") " Feb 20 18:11:07 crc kubenswrapper[4697]: I0220 18:11:07.100927 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fd82e66-968b-4df8-89ac-145803bfba18-host" (OuterVolumeSpecName: "host") pod "1fd82e66-968b-4df8-89ac-145803bfba18" (UID: "1fd82e66-968b-4df8-89ac-145803bfba18"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 18:11:07 crc kubenswrapper[4697]: I0220 18:11:07.101752 4697 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fd82e66-968b-4df8-89ac-145803bfba18-host\") on node \"crc\" DevicePath \"\"" Feb 20 18:11:07 crc kubenswrapper[4697]: I0220 18:11:07.107616 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd82e66-968b-4df8-89ac-145803bfba18-kube-api-access-9mncb" (OuterVolumeSpecName: "kube-api-access-9mncb") pod "1fd82e66-968b-4df8-89ac-145803bfba18" (UID: "1fd82e66-968b-4df8-89ac-145803bfba18"). InnerVolumeSpecName "kube-api-access-9mncb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:11:07 crc kubenswrapper[4697]: I0220 18:11:07.206693 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mncb\" (UniqueName: \"kubernetes.io/projected/1fd82e66-968b-4df8-89ac-145803bfba18-kube-api-access-9mncb\") on node \"crc\" DevicePath \"\"" Feb 20 18:11:07 crc kubenswrapper[4697]: I0220 18:11:07.901156 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/crc-debug-2pb62" event={"ID":"1fd82e66-968b-4df8-89ac-145803bfba18","Type":"ContainerDied","Data":"fa9392e4d66ff8d2fc87ed0cacf0b86410a85f6b021e5075f28b43a53b3ac839"} Feb 20 18:11:07 crc kubenswrapper[4697]: I0220 18:11:07.901192 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa9392e4d66ff8d2fc87ed0cacf0b86410a85f6b021e5075f28b43a53b3ac839" Feb 20 18:11:07 crc kubenswrapper[4697]: I0220 18:11:07.901239 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-2pb62" Feb 20 18:11:07 crc kubenswrapper[4697]: I0220 18:11:07.991923 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nbm5z/crc-debug-2pb62"] Feb 20 18:11:08 crc kubenswrapper[4697]: I0220 18:11:08.000913 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nbm5z/crc-debug-2pb62"] Feb 20 18:11:08 crc kubenswrapper[4697]: I0220 18:11:08.887682 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd82e66-968b-4df8-89ac-145803bfba18" path="/var/lib/kubelet/pods/1fd82e66-968b-4df8-89ac-145803bfba18/volumes" Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.202826 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nbm5z/crc-debug-c5vmw"] Feb 20 18:11:09 crc kubenswrapper[4697]: E0220 18:11:09.203367 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd82e66-968b-4df8-89ac-145803bfba18" 
containerName="container-00" Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.203383 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd82e66-968b-4df8-89ac-145803bfba18" containerName="container-00" Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.203583 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd82e66-968b-4df8-89ac-145803bfba18" containerName="container-00" Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.204269 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.206108 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nbm5z"/"default-dockercfg-krqct" Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.346641 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2dcb2d1-eb53-4971-94f0-9f074403942d-host\") pod \"crc-debug-c5vmw\" (UID: \"f2dcb2d1-eb53-4971-94f0-9f074403942d\") " pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.346692 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnwzr\" (UniqueName: \"kubernetes.io/projected/f2dcb2d1-eb53-4971-94f0-9f074403942d-kube-api-access-fnwzr\") pod \"crc-debug-c5vmw\" (UID: \"f2dcb2d1-eb53-4971-94f0-9f074403942d\") " pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.448516 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnwzr\" (UniqueName: \"kubernetes.io/projected/f2dcb2d1-eb53-4971-94f0-9f074403942d-kube-api-access-fnwzr\") pod \"crc-debug-c5vmw\" (UID: \"f2dcb2d1-eb53-4971-94f0-9f074403942d\") " pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" Feb 20 18:11:09 
crc kubenswrapper[4697]: I0220 18:11:09.448948 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2dcb2d1-eb53-4971-94f0-9f074403942d-host\") pod \"crc-debug-c5vmw\" (UID: \"f2dcb2d1-eb53-4971-94f0-9f074403942d\") " pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.449055 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2dcb2d1-eb53-4971-94f0-9f074403942d-host\") pod \"crc-debug-c5vmw\" (UID: \"f2dcb2d1-eb53-4971-94f0-9f074403942d\") " pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.469250 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnwzr\" (UniqueName: \"kubernetes.io/projected/f2dcb2d1-eb53-4971-94f0-9f074403942d-kube-api-access-fnwzr\") pod \"crc-debug-c5vmw\" (UID: \"f2dcb2d1-eb53-4971-94f0-9f074403942d\") " pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.520268 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" Feb 20 18:11:09 crc kubenswrapper[4697]: W0220 18:11:09.561773 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2dcb2d1_eb53_4971_94f0_9f074403942d.slice/crio-01cb4550ecfccd1e27f53c2283ca6cccc5ec080be3af88d3e6201e955ff714d2 WatchSource:0}: Error finding container 01cb4550ecfccd1e27f53c2283ca6cccc5ec080be3af88d3e6201e955ff714d2: Status 404 returned error can't find the container with id 01cb4550ecfccd1e27f53c2283ca6cccc5ec080be3af88d3e6201e955ff714d2 Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.918517 4697 generic.go:334] "Generic (PLEG): container finished" podID="f2dcb2d1-eb53-4971-94f0-9f074403942d" containerID="753ffa48584aecc765d8732307b718b409d0d0aa7e1664e3a21e9a1ea2547b73" exitCode=0 Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.918622 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" event={"ID":"f2dcb2d1-eb53-4971-94f0-9f074403942d","Type":"ContainerDied","Data":"753ffa48584aecc765d8732307b718b409d0d0aa7e1664e3a21e9a1ea2547b73"} Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.918935 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" event={"ID":"f2dcb2d1-eb53-4971-94f0-9f074403942d","Type":"ContainerStarted","Data":"01cb4550ecfccd1e27f53c2283ca6cccc5ec080be3af88d3e6201e955ff714d2"} Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.954551 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nbm5z/crc-debug-c5vmw"] Feb 20 18:11:09 crc kubenswrapper[4697]: I0220 18:11:09.964234 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nbm5z/crc-debug-c5vmw"] Feb 20 18:11:11 crc kubenswrapper[4697]: I0220 18:11:11.048098 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" Feb 20 18:11:11 crc kubenswrapper[4697]: I0220 18:11:11.184128 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnwzr\" (UniqueName: \"kubernetes.io/projected/f2dcb2d1-eb53-4971-94f0-9f074403942d-kube-api-access-fnwzr\") pod \"f2dcb2d1-eb53-4971-94f0-9f074403942d\" (UID: \"f2dcb2d1-eb53-4971-94f0-9f074403942d\") " Feb 20 18:11:11 crc kubenswrapper[4697]: I0220 18:11:11.184226 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2dcb2d1-eb53-4971-94f0-9f074403942d-host\") pod \"f2dcb2d1-eb53-4971-94f0-9f074403942d\" (UID: \"f2dcb2d1-eb53-4971-94f0-9f074403942d\") " Feb 20 18:11:11 crc kubenswrapper[4697]: I0220 18:11:11.184306 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2dcb2d1-eb53-4971-94f0-9f074403942d-host" (OuterVolumeSpecName: "host") pod "f2dcb2d1-eb53-4971-94f0-9f074403942d" (UID: "f2dcb2d1-eb53-4971-94f0-9f074403942d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 18:11:11 crc kubenswrapper[4697]: I0220 18:11:11.184791 4697 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2dcb2d1-eb53-4971-94f0-9f074403942d-host\") on node \"crc\" DevicePath \"\"" Feb 20 18:11:11 crc kubenswrapper[4697]: I0220 18:11:11.189781 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2dcb2d1-eb53-4971-94f0-9f074403942d-kube-api-access-fnwzr" (OuterVolumeSpecName: "kube-api-access-fnwzr") pod "f2dcb2d1-eb53-4971-94f0-9f074403942d" (UID: "f2dcb2d1-eb53-4971-94f0-9f074403942d"). InnerVolumeSpecName "kube-api-access-fnwzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:11:11 crc kubenswrapper[4697]: I0220 18:11:11.286650 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnwzr\" (UniqueName: \"kubernetes.io/projected/f2dcb2d1-eb53-4971-94f0-9f074403942d-kube-api-access-fnwzr\") on node \"crc\" DevicePath \"\"" Feb 20 18:11:11 crc kubenswrapper[4697]: I0220 18:11:11.936459 4697 scope.go:117] "RemoveContainer" containerID="753ffa48584aecc765d8732307b718b409d0d0aa7e1664e3a21e9a1ea2547b73" Feb 20 18:11:11 crc kubenswrapper[4697]: I0220 18:11:11.936497 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nbm5z/crc-debug-c5vmw" Feb 20 18:11:12 crc kubenswrapper[4697]: I0220 18:11:12.884695 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:11:12 crc kubenswrapper[4697]: I0220 18:11:12.890959 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2dcb2d1-eb53-4971-94f0-9f074403942d" path="/var/lib/kubelet/pods/f2dcb2d1-eb53-4971-94f0-9f074403942d/volumes" Feb 20 18:11:13 crc kubenswrapper[4697]: I0220 18:11:13.969597 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"2207295b9ca3dd56c27a6f62d8877531cf202677384869ec78ba90d1c35300e6"} Feb 20 18:11:52 crc kubenswrapper[4697]: I0220 18:11:52.701368 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fddb45f9b-25kb9_5fcf8d33-fdb6-43e6-aade-d7dc55b3848c/barbican-api/0.log" Feb 20 18:11:52 crc kubenswrapper[4697]: I0220 18:11:52.838776 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fddb45f9b-25kb9_5fcf8d33-fdb6-43e6-aade-d7dc55b3848c/barbican-api-log/0.log" Feb 20 18:11:52 crc kubenswrapper[4697]: I0220 18:11:52.922876 4697 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-674dd9ffc6-spdfl_71a69046-b0f7-4c26-a941-aba4a9475d0a/barbican-keystone-listener/0.log" Feb 20 18:11:53 crc kubenswrapper[4697]: I0220 18:11:53.008645 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-674dd9ffc6-spdfl_71a69046-b0f7-4c26-a941-aba4a9475d0a/barbican-keystone-listener-log/0.log" Feb 20 18:11:53 crc kubenswrapper[4697]: I0220 18:11:53.112118 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8474b565df-7h82r_9dc44740-061d-4c3a-9164-735d6da2dcf7/barbican-worker/0.log" Feb 20 18:11:53 crc kubenswrapper[4697]: I0220 18:11:53.179103 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8474b565df-7h82r_9dc44740-061d-4c3a-9164-735d6da2dcf7/barbican-worker-log/0.log" Feb 20 18:11:53 crc kubenswrapper[4697]: I0220 18:11:53.294487 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zxwpj_3343ad5b-476c-4e27-a5f7-e7948d8eed62/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:11:53 crc kubenswrapper[4697]: I0220 18:11:53.514807 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a3336f2b-7449-4ad5-9696-7565e147beab/ceilometer-central-agent/0.log" Feb 20 18:11:53 crc kubenswrapper[4697]: I0220 18:11:53.517074 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a3336f2b-7449-4ad5-9696-7565e147beab/proxy-httpd/0.log" Feb 20 18:11:53 crc kubenswrapper[4697]: I0220 18:11:53.586950 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a3336f2b-7449-4ad5-9696-7565e147beab/ceilometer-notification-agent/0.log" Feb 20 18:11:53 crc kubenswrapper[4697]: I0220 18:11:53.616556 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_a3336f2b-7449-4ad5-9696-7565e147beab/sg-core/0.log" Feb 20 18:11:53 crc kubenswrapper[4697]: I0220 18:11:53.823541 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e8dd164d-3a39-4c58-99a0-1766204765bf/cinder-api-log/0.log" Feb 20 18:11:54 crc kubenswrapper[4697]: I0220 18:11:54.162992 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e9192fe2-573e-4d32-915c-535887423540/probe/0.log" Feb 20 18:11:54 crc kubenswrapper[4697]: I0220 18:11:54.423223 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8a927bbf-7945-44bd-9c0c-1f24b0af5b9a/cinder-scheduler/0.log" Feb 20 18:11:54 crc kubenswrapper[4697]: I0220 18:11:54.427930 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8a927bbf-7945-44bd-9c0c-1f24b0af5b9a/probe/0.log" Feb 20 18:11:54 crc kubenswrapper[4697]: I0220 18:11:54.437681 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e9192fe2-573e-4d32-915c-535887423540/cinder-backup/0.log" Feb 20 18:11:54 crc kubenswrapper[4697]: I0220 18:11:54.464242 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e8dd164d-3a39-4c58-99a0-1766204765bf/cinder-api/0.log" Feb 20 18:11:54 crc kubenswrapper[4697]: I0220 18:11:54.669193 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_0ca8173d-4029-4a57-80c9-c63c05842bb5/probe/0.log" Feb 20 18:11:54 crc kubenswrapper[4697]: I0220 18:11:54.819246 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_0ca8173d-4029-4a57-80c9-c63c05842bb5/cinder-volume/0.log" Feb 20 18:11:54 crc kubenswrapper[4697]: I0220 18:11:54.907887 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_59ea396a-fa78-4a1a-95e5-48f3e4e49bda/probe/0.log" Feb 20 18:11:55 crc 
kubenswrapper[4697]: I0220 18:11:55.082393 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_59ea396a-fa78-4a1a-95e5-48f3e4e49bda/cinder-volume/0.log" Feb 20 18:11:55 crc kubenswrapper[4697]: I0220 18:11:55.084098 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7x7qt_a2756b57-da81-4893-85d4-119fe103b4de/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:11:55 crc kubenswrapper[4697]: I0220 18:11:55.170087 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7gj5f_904a709b-1b1b-46d9-b2cd-d6517ff7ef07/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:11:55 crc kubenswrapper[4697]: I0220 18:11:55.317126 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5955df7555-qdwfx_af760d37-5cb2-4378-811d-cf343b5c9faf/init/0.log" Feb 20 18:11:55 crc kubenswrapper[4697]: I0220 18:11:55.476943 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5955df7555-qdwfx_af760d37-5cb2-4378-811d-cf343b5c9faf/init/0.log" Feb 20 18:11:55 crc kubenswrapper[4697]: I0220 18:11:55.511173 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-ft9p6_671fae5d-08e8-4fac-ba16-e33a5a4f1f0b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:11:55 crc kubenswrapper[4697]: I0220 18:11:55.595686 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5955df7555-qdwfx_af760d37-5cb2-4378-811d-cf343b5c9faf/dnsmasq-dns/0.log" Feb 20 18:11:55 crc kubenswrapper[4697]: I0220 18:11:55.739397 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0/glance-httpd/0.log" Feb 20 18:11:55 crc kubenswrapper[4697]: I0220 
18:11:55.789462 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b69dfa3b-7b4b-46b7-a3bd-88a26c4e62f0/glance-log/0.log" Feb 20 18:11:55 crc kubenswrapper[4697]: I0220 18:11:55.903970 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2eec23bf-874f-423a-8d8c-b3f20b494c87/glance-httpd/0.log" Feb 20 18:11:55 crc kubenswrapper[4697]: I0220 18:11:55.961524 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2eec23bf-874f-423a-8d8c-b3f20b494c87/glance-log/0.log" Feb 20 18:11:56 crc kubenswrapper[4697]: I0220 18:11:56.047082 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bc54df884-mx794_215f7a56-10a1-4ae5-9071-4983dbb45b35/horizon/0.log" Feb 20 18:11:56 crc kubenswrapper[4697]: I0220 18:11:56.347790 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-6gnqq_04c7e6bd-f464-42ea-aa0b-a4b47a169d6f/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:11:56 crc kubenswrapper[4697]: I0220 18:11:56.579997 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-6mlzs_0272cd28-d20c-4ffb-9dc0-dbaa2d92aae9/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:11:56 crc kubenswrapper[4697]: I0220 18:11:56.809591 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29526781-56g9l_fb8e173e-11e6-4bc4-a87e-58fd25b53076/keystone-cron/0.log" Feb 20 18:11:56 crc kubenswrapper[4697]: I0220 18:11:56.813790 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bc54df884-mx794_215f7a56-10a1-4ae5-9071-4983dbb45b35/horizon-log/0.log" Feb 20 18:11:56 crc kubenswrapper[4697]: I0220 18:11:56.926408 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29526841-57prn_594dde87-85f5-42d7-affc-466eb2311afc/keystone-cron/0.log" Feb 20 18:11:57 crc kubenswrapper[4697]: I0220 18:11:57.092577 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85d8b4ccc6-tdklm_9e1ff974-0f05-484c-b267-f1537ec9495e/keystone-api/0.log" Feb 20 18:11:57 crc kubenswrapper[4697]: I0220 18:11:57.106942 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_77b717c8-8b8d-4236-bd5e-95fb768f1f89/kube-state-metrics/0.log" Feb 20 18:11:57 crc kubenswrapper[4697]: I0220 18:11:57.244134 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-5hqt4_e6c6663b-45c5-4629-98fb-23de62292ee1/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:11:57 crc kubenswrapper[4697]: I0220 18:11:57.677305 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vwqcf_8190a9c0-1f92-4f97-8d67-04668a6920a2/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:11:57 crc kubenswrapper[4697]: I0220 18:11:57.712458 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76b77c89fc-t9rjg_0e0dd99f-4186-47a0-b6ba-c8d4abd040b0/neutron-httpd/0.log" Feb 20 18:11:57 crc kubenswrapper[4697]: I0220 18:11:57.802951 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76b77c89fc-t9rjg_0e0dd99f-4186-47a0-b6ba-c8d4abd040b0/neutron-api/0.log" Feb 20 18:11:57 crc kubenswrapper[4697]: I0220 18:11:57.898808 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_591f7e7d-78bf-43a5-afe2-119f93765311/setup-container/0.log" Feb 20 18:11:58 crc kubenswrapper[4697]: I0220 18:11:58.064472 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_notifications-rabbitmq-server-0_591f7e7d-78bf-43a5-afe2-119f93765311/setup-container/0.log" Feb 20 18:11:58 crc kubenswrapper[4697]: I0220 18:11:58.140354 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_591f7e7d-78bf-43a5-afe2-119f93765311/rabbitmq/0.log" Feb 20 18:11:58 crc kubenswrapper[4697]: I0220 18:11:58.767604 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2a29103e-4075-486e-8107-34b4a75352cc/nova-cell0-conductor-conductor/0.log" Feb 20 18:11:59 crc kubenswrapper[4697]: I0220 18:11:59.049614 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a52da959-5d9a-4a66-8907-400b5bd0acfa/nova-cell1-conductor-conductor/0.log" Feb 20 18:11:59 crc kubenswrapper[4697]: I0220 18:11:59.539296 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_4b1ddcd6-abc4-467c-8cd1-1937c803e0b4/nova-cell1-novncproxy-novncproxy/0.log" Feb 20 18:11:59 crc kubenswrapper[4697]: I0220 18:11:59.615641 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-wzxdf_62242a65-ea27-495f-aa04-4a274f9e771a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:11:59 crc kubenswrapper[4697]: I0220 18:11:59.692789 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_947447ab-bffc-4330-9983-38789a4e8fcc/nova-api-log/0.log" Feb 20 18:11:59 crc kubenswrapper[4697]: I0220 18:11:59.895547 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5f42a083-98ed-403d-90ca-cc4ef5ba79d1/nova-metadata-log/0.log" Feb 20 18:12:00 crc kubenswrapper[4697]: I0220 18:12:00.304720 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_947447ab-bffc-4330-9983-38789a4e8fcc/nova-api-api/0.log" Feb 20 18:12:00 crc kubenswrapper[4697]: I0220 
18:12:00.467145 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b33962a9-1867-4e1c-b597-d426ecf83e50/mysql-bootstrap/0.log" Feb 20 18:12:00 crc kubenswrapper[4697]: I0220 18:12:00.518917 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9bcd6e42-c512-4076-926f-b64c1da63a8a/nova-scheduler-scheduler/0.log" Feb 20 18:12:00 crc kubenswrapper[4697]: I0220 18:12:00.599960 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b33962a9-1867-4e1c-b597-d426ecf83e50/mysql-bootstrap/0.log" Feb 20 18:12:00 crc kubenswrapper[4697]: I0220 18:12:00.694023 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b33962a9-1867-4e1c-b597-d426ecf83e50/galera/0.log" Feb 20 18:12:00 crc kubenswrapper[4697]: I0220 18:12:00.791586 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33594c24-be5d-42de-ba91-5584becb21e3/mysql-bootstrap/0.log" Feb 20 18:12:01 crc kubenswrapper[4697]: I0220 18:12:01.020964 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33594c24-be5d-42de-ba91-5584becb21e3/mysql-bootstrap/0.log" Feb 20 18:12:01 crc kubenswrapper[4697]: I0220 18:12:01.082862 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33594c24-be5d-42de-ba91-5584becb21e3/galera/0.log" Feb 20 18:12:01 crc kubenswrapper[4697]: I0220 18:12:01.254351 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8dae4cc2-1fb9-47ff-af11-854c15a884a3/openstackclient/0.log" Feb 20 18:12:01 crc kubenswrapper[4697]: I0220 18:12:01.328760 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-62kkp_aa3e6a99-a9a4-4578-94c6-8a4b641405ec/ovn-controller/0.log" Feb 20 18:12:01 crc kubenswrapper[4697]: I0220 18:12:01.527484 4697 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-controller-metrics-wrhqd_650a1475-c144-4a21-a156-9859cb1418d4/openstack-network-exporter/0.log" Feb 20 18:12:01 crc kubenswrapper[4697]: I0220 18:12:01.705510 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wh5h_be3e14f5-1877-4618-87b6-b60623792988/ovsdb-server-init/0.log" Feb 20 18:12:01 crc kubenswrapper[4697]: I0220 18:12:01.861953 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wh5h_be3e14f5-1877-4618-87b6-b60623792988/ovsdb-server-init/0.log" Feb 20 18:12:01 crc kubenswrapper[4697]: I0220 18:12:01.881702 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wh5h_be3e14f5-1877-4618-87b6-b60623792988/ovsdb-server/0.log" Feb 20 18:12:02 crc kubenswrapper[4697]: I0220 18:12:02.124369 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-lc2h7_b1aebd55-3d79-403b-978d-04afedd25c3d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:12:02 crc kubenswrapper[4697]: I0220 18:12:02.230639 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5f42a083-98ed-403d-90ca-cc4ef5ba79d1/nova-metadata-metadata/0.log" Feb 20 18:12:02 crc kubenswrapper[4697]: I0220 18:12:02.273685 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wh5h_be3e14f5-1877-4618-87b6-b60623792988/ovs-vswitchd/0.log" Feb 20 18:12:02 crc kubenswrapper[4697]: I0220 18:12:02.310838 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00989e93-9419-47e9-a3ba-7b5e65910be9/openstack-network-exporter/0.log" Feb 20 18:12:02 crc kubenswrapper[4697]: I0220 18:12:02.454056 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00989e93-9419-47e9-a3ba-7b5e65910be9/ovn-northd/0.log" Feb 20 18:12:02 crc kubenswrapper[4697]: I0220 
18:12:02.493973 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fb5d6748-960a-41d3-a11e-6dd21c3dd46f/ovsdbserver-nb/0.log" Feb 20 18:12:02 crc kubenswrapper[4697]: I0220 18:12:02.513993 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fb5d6748-960a-41d3-a11e-6dd21c3dd46f/openstack-network-exporter/0.log" Feb 20 18:12:02 crc kubenswrapper[4697]: I0220 18:12:02.688252 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b179bb2c-61ea-4bca-860b-b419bb8d3341/openstack-network-exporter/0.log" Feb 20 18:12:02 crc kubenswrapper[4697]: I0220 18:12:02.729940 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b179bb2c-61ea-4bca-860b-b419bb8d3341/ovsdbserver-sb/0.log" Feb 20 18:12:02 crc kubenswrapper[4697]: I0220 18:12:02.966080 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2cb601d7-bad1-4085-9519-4cb9927fa531/init-config-reloader/0.log" Feb 20 18:12:03 crc kubenswrapper[4697]: I0220 18:12:03.156418 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57d8cdd7b4-pxpls_8cecab8a-e2db-476f-89d1-b28b4f585d57/placement-api/0.log" Feb 20 18:12:03 crc kubenswrapper[4697]: I0220 18:12:03.211201 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57d8cdd7b4-pxpls_8cecab8a-e2db-476f-89d1-b28b4f585d57/placement-log/0.log" Feb 20 18:12:03 crc kubenswrapper[4697]: I0220 18:12:03.235261 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2cb601d7-bad1-4085-9519-4cb9927fa531/init-config-reloader/0.log" Feb 20 18:12:03 crc kubenswrapper[4697]: I0220 18:12:03.273330 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2cb601d7-bad1-4085-9519-4cb9927fa531/config-reloader/0.log" Feb 20 18:12:03 crc kubenswrapper[4697]: 
I0220 18:12:03.386418 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2cb601d7-bad1-4085-9519-4cb9927fa531/prometheus/0.log" Feb 20 18:12:03 crc kubenswrapper[4697]: I0220 18:12:03.443196 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2cb601d7-bad1-4085-9519-4cb9927fa531/thanos-sidecar/0.log" Feb 20 18:12:03 crc kubenswrapper[4697]: I0220 18:12:03.516331 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_12a44ac2-6e80-4bac-9079-0b6637de700a/setup-container/0.log" Feb 20 18:12:03 crc kubenswrapper[4697]: I0220 18:12:03.719090 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9972c409-92f3-4ec7-9b59-cccd334b761e/setup-container/0.log" Feb 20 18:12:03 crc kubenswrapper[4697]: I0220 18:12:03.740009 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_12a44ac2-6e80-4bac-9079-0b6637de700a/rabbitmq/0.log" Feb 20 18:12:03 crc kubenswrapper[4697]: I0220 18:12:03.768571 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_12a44ac2-6e80-4bac-9079-0b6637de700a/setup-container/0.log" Feb 20 18:12:03 crc kubenswrapper[4697]: I0220 18:12:03.987685 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9972c409-92f3-4ec7-9b59-cccd334b761e/rabbitmq/0.log" Feb 20 18:12:04 crc kubenswrapper[4697]: I0220 18:12:04.021744 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9972c409-92f3-4ec7-9b59-cccd334b761e/setup-container/0.log" Feb 20 18:12:04 crc kubenswrapper[4697]: I0220 18:12:04.060906 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b5qgk_eace25c0-d234-43c5-88a0-f8ba1fc78dac/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:12:04 crc 
kubenswrapper[4697]: I0220 18:12:04.278791 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-gp4g2_603c4295-1159-4a8b-856f-c40cb2a0838c/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:12:04 crc kubenswrapper[4697]: I0220 18:12:04.371543 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-lgqpf_143cf213-8274-47bd-b6f4-80f2d465275c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:12:04 crc kubenswrapper[4697]: I0220 18:12:04.500019 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7tfq6_0700858d-9b11-4cca-a80c-143da84eea6e/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:12:04 crc kubenswrapper[4697]: I0220 18:12:04.662999 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-57pvl_ab058f87-f768-4d3a-b7cf-a39feef5c5f6/ssh-known-hosts-edpm-deployment/0.log" Feb 20 18:12:04 crc kubenswrapper[4697]: I0220 18:12:04.825888 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d485f4f89-v5ctf_536e289e-762f-4f9f-8b58-027b09cf2609/proxy-server/0.log" Feb 20 18:12:04 crc kubenswrapper[4697]: I0220 18:12:04.912723 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-frqkk_a267ce98-60eb-4c3c-8906-de42f6872680/swift-ring-rebalance/0.log" Feb 20 18:12:04 crc kubenswrapper[4697]: I0220 18:12:04.927701 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d485f4f89-v5ctf_536e289e-762f-4f9f-8b58-027b09cf2609/proxy-httpd/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.131778 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/account-reaper/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 
18:12:05.132122 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/account-auditor/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.147774 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/account-replicator/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.305620 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/account-server/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.322022 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/container-auditor/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.418422 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/container-server/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.422496 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/container-replicator/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.492268 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/container-updater/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.591702 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/object-auditor/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.651920 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/object-expirer/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.676140 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/object-replicator/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.742454 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/object-server/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.808922 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/object-updater/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.833526 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/rsync/0.log" Feb 20 18:12:05 crc kubenswrapper[4697]: I0220 18:12:05.906613 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8a8a227a-2c59-4ecd-a4c3-69c9018f1c13/swift-recon-cron/0.log" Feb 20 18:12:06 crc kubenswrapper[4697]: I0220 18:12:06.141749 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9st56_db0ef420-9810-4b2f-8f10-b3fb710293c6/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:12:06 crc kubenswrapper[4697]: I0220 18:12:06.184033 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e83acb01-3a91-4950-848f-d447679c0533/tempest-tests-tempest-tests-runner/0.log" Feb 20 18:12:06 crc kubenswrapper[4697]: I0220 18:12:06.274094 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_1b384505-09b4-453d-9418-b3116dfe429e/test-operator-logs-container/0.log" Feb 20 18:12:06 crc kubenswrapper[4697]: I0220 18:12:06.394132 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-cmwg8_0ed71fc6-e5c0-40fe-988e-04a30088f620/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 20 18:12:07 crc kubenswrapper[4697]: I0220 18:12:07.130296 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_14b3fb67-823a-4da5-a42d-27745717ba8b/watcher-applier/0.log" Feb 20 18:12:07 crc kubenswrapper[4697]: I0220 18:12:07.841393 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_04b1d7c5-4d8e-4962-b396-855adb2605d7/watcher-api-log/0.log" Feb 20 18:12:09 crc kubenswrapper[4697]: I0220 18:12:09.488329 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_19a9810e-52c4-4428-9fca-5bf65d100f50/memcached/0.log" Feb 20 18:12:10 crc kubenswrapper[4697]: I0220 18:12:10.233626 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_21f4378d-bb26-408d-9613-82246765639b/watcher-decision-engine/0.log" Feb 20 18:12:10 crc kubenswrapper[4697]: I0220 18:12:10.927424 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_04b1d7c5-4d8e-4962-b396-855adb2605d7/watcher-api/0.log" Feb 20 18:12:32 crc kubenswrapper[4697]: I0220 18:12:32.662375 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/util/0.log" Feb 20 18:12:32 crc kubenswrapper[4697]: I0220 18:12:32.891637 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/util/0.log" Feb 20 18:12:32 crc kubenswrapper[4697]: I0220 18:12:32.903059 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/pull/0.log" Feb 20 18:12:33 crc kubenswrapper[4697]: I0220 18:12:33.093611 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/pull/0.log" Feb 20 18:12:33 crc kubenswrapper[4697]: I0220 18:12:33.241502 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/util/0.log" Feb 20 18:12:33 crc kubenswrapper[4697]: I0220 18:12:33.278789 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/pull/0.log" Feb 20 18:12:33 crc kubenswrapper[4697]: I0220 18:12:33.441132 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d18946a00d986f01acf1f900340448866fce736a764248fc63e0f5fddbrqkqb_102f803e-192f-41ae-8742-2c9ba8ad7806/extract/0.log" Feb 20 18:12:33 crc kubenswrapper[4697]: I0220 18:12:33.618061 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-jxdb5_f56f8133-887c-456a-9cbf-6df7713789b3/manager/0.log" Feb 20 18:12:33 crc kubenswrapper[4697]: I0220 18:12:33.918045 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-5xkg5_71f3f3ad-1f6c-4d59-9fc8-b036014c1068/manager/0.log" Feb 20 18:12:34 crc kubenswrapper[4697]: I0220 18:12:34.136589 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-4nqmq_d233b891-dac3-4565-971b-85141828260d/manager/0.log" Feb 20 18:12:34 crc kubenswrapper[4697]: I0220 
18:12:34.351565 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-qjfrm_b0605f71-51c5-49d9-8936-77affb7cf0bf/manager/0.log" Feb 20 18:12:34 crc kubenswrapper[4697]: I0220 18:12:34.784736 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-82v6j_d9926f6e-afca-48ad-8a52-fb7f53ba3dec/manager/0.log" Feb 20 18:12:34 crc kubenswrapper[4697]: I0220 18:12:34.958230 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-9zdp9_0b5f03ab-32bb-48ef-b7d7-1ede5fb51924/manager/0.log" Feb 20 18:12:35 crc kubenswrapper[4697]: I0220 18:12:35.169588 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-j9lmt_d30d696e-1555-4fc2-9316-c795de608048/manager/0.log" Feb 20 18:12:35 crc kubenswrapper[4697]: I0220 18:12:35.221077 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-rw4q5_4fdb933f-aa86-4b88-9b08-4783ce0f6e0c/manager/0.log" Feb 20 18:12:35 crc kubenswrapper[4697]: I0220 18:12:35.401443 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-dknf8_560f1df6-c03f-42ad-8175-5508f56e1ecc/manager/0.log" Feb 20 18:12:35 crc kubenswrapper[4697]: I0220 18:12:35.450391 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-gsmns_366fd13a-060b-4572-9541-dbf88a507588/manager/0.log" Feb 20 18:12:35 crc kubenswrapper[4697]: I0220 18:12:35.719676 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-ct4nk_21c4fbfe-19b7-4303-b4bf-72dbc90044dd/manager/0.log" Feb 20 18:12:35 crc 
kubenswrapper[4697]: I0220 18:12:35.936265 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-4mtzl_cae8d6b1-4649-40bb-b710-1197ac78db1b/manager/0.log" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.192139 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c7ptf8_3f3b7ed7-e806-4fa9-ac88-381c0b4bd237/manager/0.log" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.282801 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l25kc"] Feb 20 18:12:36 crc kubenswrapper[4697]: E0220 18:12:36.283271 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2dcb2d1-eb53-4971-94f0-9f074403942d" containerName="container-00" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.283287 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2dcb2d1-eb53-4971-94f0-9f074403942d" containerName="container-00" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.283505 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2dcb2d1-eb53-4971-94f0-9f074403942d" containerName="container-00" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.284948 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.297064 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l25kc"] Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.342290 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-utilities\") pod \"community-operators-l25kc\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.342426 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-catalog-content\") pod \"community-operators-l25kc\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.342531 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x88qz\" (UniqueName: \"kubernetes.io/projected/7de40c05-aabd-43fb-b655-f19b5e81937d-kube-api-access-x88qz\") pod \"community-operators-l25kc\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.443566 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-catalog-content\") pod \"community-operators-l25kc\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.443660 4697 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x88qz\" (UniqueName: \"kubernetes.io/projected/7de40c05-aabd-43fb-b655-f19b5e81937d-kube-api-access-x88qz\") pod \"community-operators-l25kc\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.443716 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-utilities\") pod \"community-operators-l25kc\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.444119 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-catalog-content\") pod \"community-operators-l25kc\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.444237 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-utilities\") pod \"community-operators-l25kc\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.477857 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x88qz\" (UniqueName: \"kubernetes.io/projected/7de40c05-aabd-43fb-b655-f19b5e81937d-kube-api-access-x88qz\") pod \"community-operators-l25kc\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.638545 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-init-64dbc77f9f-pqx8x_9df0ae9c-f41a-4c92-b62e-ff0f230da65c/operator/0.log" Feb 20 18:12:36 crc kubenswrapper[4697]: I0220 18:12:36.645049 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:37 crc kubenswrapper[4697]: I0220 18:12:37.101924 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4nlwj_b3087f12-ba44-4ef2-af22-3d77e30b1d84/registry-server/0.log" Feb 20 18:12:37 crc kubenswrapper[4697]: I0220 18:12:37.191138 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l25kc"] Feb 20 18:12:37 crc kubenswrapper[4697]: I0220 18:12:37.435795 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-lsgnq_55a77be1-9486-4b8a-acc6-a4d8532016d3/manager/0.log" Feb 20 18:12:37 crc kubenswrapper[4697]: I0220 18:12:37.625536 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-l5qll_8c8d2c10-e4b6-4d37-977f-7ad685981d2f/manager/0.log" Feb 20 18:12:37 crc kubenswrapper[4697]: I0220 18:12:37.791312 4697 generic.go:334] "Generic (PLEG): container finished" podID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerID="09ab7f7c76dfcb2c9538093aa161ab861003a6218c228a970909bc0cff287a2a" exitCode=0 Feb 20 18:12:37 crc kubenswrapper[4697]: I0220 18:12:37.791665 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l25kc" event={"ID":"7de40c05-aabd-43fb-b655-f19b5e81937d","Type":"ContainerDied","Data":"09ab7f7c76dfcb2c9538093aa161ab861003a6218c228a970909bc0cff287a2a"} Feb 20 18:12:37 crc kubenswrapper[4697]: I0220 18:12:37.791696 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l25kc" 
event={"ID":"7de40c05-aabd-43fb-b655-f19b5e81937d","Type":"ContainerStarted","Data":"5d719bcac239ffb5d3117176305790279bf1bbf320a3a22630124a967d8b0019"} Feb 20 18:12:37 crc kubenswrapper[4697]: I0220 18:12:37.793485 4697 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 18:12:37 crc kubenswrapper[4697]: I0220 18:12:37.870207 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xnpms_8546e4ea-d7f0-4244-8496-e962809c4203/operator/0.log" Feb 20 18:12:38 crc kubenswrapper[4697]: I0220 18:12:38.148743 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-2mxvz_cb67b5ad-353e-4d96-8d94-fc69e4801f64/manager/0.log" Feb 20 18:12:38 crc kubenswrapper[4697]: I0220 18:12:38.593670 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-wjwzl_6af9a0a9-0546-4a59-bdca-1a0609421010/manager/0.log" Feb 20 18:12:38 crc kubenswrapper[4697]: I0220 18:12:38.740845 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-b89r4_a675eb01-18af-4776-94e6-64c0b392248b/manager/0.log" Feb 20 18:12:39 crc kubenswrapper[4697]: I0220 18:12:39.086394 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-9d9d9f9cd-t7nvz_d8c72591-eb8d-4553-867d-60482d51c4db/manager/0.log" Feb 20 18:12:39 crc kubenswrapper[4697]: I0220 18:12:39.172615 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-b45cc898b-82j7k_33e3fc43-dbfe-4fff-bac3-6021dfa84982/manager/0.log" Feb 20 18:12:39 crc kubenswrapper[4697]: I0220 18:12:39.286063 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-7jlc2_8b48098d-ef4c-4cde-beef-a7c34573699b/manager/0.log" Feb 20 18:12:42 crc kubenswrapper[4697]: I0220 18:12:42.831500 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l25kc" event={"ID":"7de40c05-aabd-43fb-b655-f19b5e81937d","Type":"ContainerStarted","Data":"22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526"} Feb 20 18:12:44 crc kubenswrapper[4697]: I0220 18:12:44.595820 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-kqjzr_fcf00eef-940f-4da3-8359-325f1abb0c6d/manager/0.log" Feb 20 18:12:44 crc kubenswrapper[4697]: I0220 18:12:44.854023 4697 generic.go:334] "Generic (PLEG): container finished" podID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerID="22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526" exitCode=0 Feb 20 18:12:44 crc kubenswrapper[4697]: I0220 18:12:44.854111 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l25kc" event={"ID":"7de40c05-aabd-43fb-b655-f19b5e81937d","Type":"ContainerDied","Data":"22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526"} Feb 20 18:12:45 crc kubenswrapper[4697]: I0220 18:12:45.865614 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l25kc" event={"ID":"7de40c05-aabd-43fb-b655-f19b5e81937d","Type":"ContainerStarted","Data":"0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf"} Feb 20 18:12:45 crc kubenswrapper[4697]: I0220 18:12:45.892924 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l25kc" podStartSLOduration=2.421422331 podStartE2EDuration="9.892904897s" podCreationTimestamp="2026-02-20 18:12:36 +0000 UTC" firstStartedPulling="2026-02-20 18:12:37.793185509 +0000 UTC 
m=+6065.573230917" lastFinishedPulling="2026-02-20 18:12:45.264668075 +0000 UTC m=+6073.044713483" observedRunningTime="2026-02-20 18:12:45.884263145 +0000 UTC m=+6073.664308553" watchObservedRunningTime="2026-02-20 18:12:45.892904897 +0000 UTC m=+6073.672950305" Feb 20 18:12:46 crc kubenswrapper[4697]: I0220 18:12:46.647568 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:46 crc kubenswrapper[4697]: I0220 18:12:46.647617 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:47 crc kubenswrapper[4697]: I0220 18:12:47.695227 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-l25kc" podUID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerName="registry-server" probeResult="failure" output=< Feb 20 18:12:47 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 18:12:47 crc kubenswrapper[4697]: > Feb 20 18:12:56 crc kubenswrapper[4697]: I0220 18:12:56.701535 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:56 crc kubenswrapper[4697]: I0220 18:12:56.776934 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:56 crc kubenswrapper[4697]: I0220 18:12:56.945666 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l25kc"] Feb 20 18:12:57 crc kubenswrapper[4697]: I0220 18:12:57.968324 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l25kc" podUID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerName="registry-server" containerID="cri-o://0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf" gracePeriod=2 Feb 20 
18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.463633 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.582947 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-catalog-content\") pod \"7de40c05-aabd-43fb-b655-f19b5e81937d\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.583014 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-utilities\") pod \"7de40c05-aabd-43fb-b655-f19b5e81937d\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.583122 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x88qz\" (UniqueName: \"kubernetes.io/projected/7de40c05-aabd-43fb-b655-f19b5e81937d-kube-api-access-x88qz\") pod \"7de40c05-aabd-43fb-b655-f19b5e81937d\" (UID: \"7de40c05-aabd-43fb-b655-f19b5e81937d\") " Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.583618 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-utilities" (OuterVolumeSpecName: "utilities") pod "7de40c05-aabd-43fb-b655-f19b5e81937d" (UID: "7de40c05-aabd-43fb-b655-f19b5e81937d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.583767 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.596571 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de40c05-aabd-43fb-b655-f19b5e81937d-kube-api-access-x88qz" (OuterVolumeSpecName: "kube-api-access-x88qz") pod "7de40c05-aabd-43fb-b655-f19b5e81937d" (UID: "7de40c05-aabd-43fb-b655-f19b5e81937d"). InnerVolumeSpecName "kube-api-access-x88qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.638462 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7de40c05-aabd-43fb-b655-f19b5e81937d" (UID: "7de40c05-aabd-43fb-b655-f19b5e81937d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.686058 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7de40c05-aabd-43fb-b655-f19b5e81937d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.686086 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x88qz\" (UniqueName: \"kubernetes.io/projected/7de40c05-aabd-43fb-b655-f19b5e81937d-kube-api-access-x88qz\") on node \"crc\" DevicePath \"\"" Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.979658 4697 generic.go:334] "Generic (PLEG): container finished" podID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerID="0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf" exitCode=0 Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.979711 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l25kc" event={"ID":"7de40c05-aabd-43fb-b655-f19b5e81937d","Type":"ContainerDied","Data":"0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf"} Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.979741 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l25kc" Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.980078 4697 scope.go:117] "RemoveContainer" containerID="0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf" Feb 20 18:12:58 crc kubenswrapper[4697]: I0220 18:12:58.980058 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l25kc" event={"ID":"7de40c05-aabd-43fb-b655-f19b5e81937d","Type":"ContainerDied","Data":"5d719bcac239ffb5d3117176305790279bf1bbf320a3a22630124a967d8b0019"} Feb 20 18:12:59 crc kubenswrapper[4697]: I0220 18:12:59.003516 4697 scope.go:117] "RemoveContainer" containerID="22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526" Feb 20 18:12:59 crc kubenswrapper[4697]: I0220 18:12:59.009509 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l25kc"] Feb 20 18:12:59 crc kubenswrapper[4697]: I0220 18:12:59.033762 4697 scope.go:117] "RemoveContainer" containerID="09ab7f7c76dfcb2c9538093aa161ab861003a6218c228a970909bc0cff287a2a" Feb 20 18:12:59 crc kubenswrapper[4697]: I0220 18:12:59.035815 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l25kc"] Feb 20 18:12:59 crc kubenswrapper[4697]: I0220 18:12:59.148225 4697 scope.go:117] "RemoveContainer" containerID="0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf" Feb 20 18:12:59 crc kubenswrapper[4697]: E0220 18:12:59.148702 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf\": container with ID starting with 0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf not found: ID does not exist" containerID="0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf" Feb 20 18:12:59 crc kubenswrapper[4697]: I0220 18:12:59.148725 4697 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf"} err="failed to get container status \"0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf\": rpc error: code = NotFound desc = could not find container \"0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf\": container with ID starting with 0d1573a4bed6218a1414933bfd5ff4f9e4a80bc9df83f8715b4f9906071fadaf not found: ID does not exist" Feb 20 18:12:59 crc kubenswrapper[4697]: I0220 18:12:59.148743 4697 scope.go:117] "RemoveContainer" containerID="22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526" Feb 20 18:12:59 crc kubenswrapper[4697]: E0220 18:12:59.149168 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526\": container with ID starting with 22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526 not found: ID does not exist" containerID="22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526" Feb 20 18:12:59 crc kubenswrapper[4697]: I0220 18:12:59.149207 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526"} err="failed to get container status \"22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526\": rpc error: code = NotFound desc = could not find container \"22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526\": container with ID starting with 22e8e842dd432776a5c71f13b4b5fed2ac55ec5ef4513c8d42641f4ec8bc3526 not found: ID does not exist" Feb 20 18:12:59 crc kubenswrapper[4697]: I0220 18:12:59.149237 4697 scope.go:117] "RemoveContainer" containerID="09ab7f7c76dfcb2c9538093aa161ab861003a6218c228a970909bc0cff287a2a" Feb 20 18:12:59 crc kubenswrapper[4697]: E0220 
18:12:59.149543 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ab7f7c76dfcb2c9538093aa161ab861003a6218c228a970909bc0cff287a2a\": container with ID starting with 09ab7f7c76dfcb2c9538093aa161ab861003a6218c228a970909bc0cff287a2a not found: ID does not exist" containerID="09ab7f7c76dfcb2c9538093aa161ab861003a6218c228a970909bc0cff287a2a" Feb 20 18:12:59 crc kubenswrapper[4697]: I0220 18:12:59.149563 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ab7f7c76dfcb2c9538093aa161ab861003a6218c228a970909bc0cff287a2a"} err="failed to get container status \"09ab7f7c76dfcb2c9538093aa161ab861003a6218c228a970909bc0cff287a2a\": rpc error: code = NotFound desc = could not find container \"09ab7f7c76dfcb2c9538093aa161ab861003a6218c228a970909bc0cff287a2a\": container with ID starting with 09ab7f7c76dfcb2c9538093aa161ab861003a6218c228a970909bc0cff287a2a not found: ID does not exist" Feb 20 18:13:00 crc kubenswrapper[4697]: I0220 18:13:00.670253 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dkznt_8912bf5c-045e-4c86-9a09-41c4cab10139/control-plane-machine-set-operator/0.log" Feb 20 18:13:00 crc kubenswrapper[4697]: I0220 18:13:00.846473 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f6d4m_ca43f4e8-ab22-4573-b5cb-5d58dbf788f1/kube-rbac-proxy/0.log" Feb 20 18:13:00 crc kubenswrapper[4697]: I0220 18:13:00.857423 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f6d4m_ca43f4e8-ab22-4573-b5cb-5d58dbf788f1/machine-api-operator/0.log" Feb 20 18:13:00 crc kubenswrapper[4697]: I0220 18:13:00.907354 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7de40c05-aabd-43fb-b655-f19b5e81937d" 
path="/var/lib/kubelet/pods/7de40c05-aabd-43fb-b655-f19b5e81937d/volumes" Feb 20 18:13:13 crc kubenswrapper[4697]: I0220 18:13:13.997476 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-x2rxq_ee3c17f3-a89c-49fa-8cf9-75e4914401cc/cert-manager-controller/0.log" Feb 20 18:13:14 crc kubenswrapper[4697]: I0220 18:13:14.122445 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-rs7j9_2c82f3c1-cc18-41b5-9489-61f52f31a74a/cert-manager-cainjector/0.log" Feb 20 18:13:14 crc kubenswrapper[4697]: I0220 18:13:14.218843 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-zd9ck_36e190bc-14ce-4ce6-92ac-d48197515527/cert-manager-webhook/0.log" Feb 20 18:13:27 crc kubenswrapper[4697]: I0220 18:13:27.604354 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-9x7l9_d88c9050-548c-4305-b532-085c83436b3e/nmstate-console-plugin/0.log" Feb 20 18:13:27 crc kubenswrapper[4697]: I0220 18:13:27.812593 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mmnb6_89578f54-097d-4b2b-9809-103034a3a114/nmstate-handler/0.log" Feb 20 18:13:27 crc kubenswrapper[4697]: I0220 18:13:27.854058 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-vwprb_7b12f1ce-9e0b-458b-990a-e38c2f2139c5/kube-rbac-proxy/0.log" Feb 20 18:13:27 crc kubenswrapper[4697]: I0220 18:13:27.951813 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-vwprb_7b12f1ce-9e0b-458b-990a-e38c2f2139c5/nmstate-metrics/0.log" Feb 20 18:13:28 crc kubenswrapper[4697]: I0220 18:13:28.022023 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-ljdjd_278cea78-f0e4-4785-bc49-aa335706ccac/nmstate-operator/0.log" 
Feb 20 18:13:28 crc kubenswrapper[4697]: I0220 18:13:28.157707 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-xmzrc_567d32fd-31d2-4822-b487-ec35c250663d/nmstate-webhook/0.log" Feb 20 18:13:31 crc kubenswrapper[4697]: I0220 18:13:31.184990 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 18:13:31 crc kubenswrapper[4697]: I0220 18:13:31.186584 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 18:13:42 crc kubenswrapper[4697]: I0220 18:13:42.013604 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-cqddv_214e6680-a3c8-4286-9c91-d68893ba73be/prometheus-operator/0.log" Feb 20 18:13:42 crc kubenswrapper[4697]: I0220 18:13:42.148275 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8679bd8497-phkqw_68c668ba-cbcc-4330-95e2-012c78108925/prometheus-operator-admission-webhook/0.log" Feb 20 18:13:42 crc kubenswrapper[4697]: I0220 18:13:42.243293 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5_565bbe20-d8eb-4878-a048-4d78d8123f6d/prometheus-operator-admission-webhook/0.log" Feb 20 18:13:42 crc kubenswrapper[4697]: I0220 18:13:42.317542 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8wmf5_12db7fd9-ce39-43cf-99b7-3a56791c0390/operator/0.log" Feb 20 18:13:42 crc kubenswrapper[4697]: I0220 18:13:42.603904 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-j5gtw_94f7dd38-b255-46cd-8b05-4c720857dd86/perses-operator/0.log" Feb 20 18:13:57 crc kubenswrapper[4697]: I0220 18:13:57.690560 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-64qzx_e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3/kube-rbac-proxy/0.log" Feb 20 18:13:57 crc kubenswrapper[4697]: I0220 18:13:57.815051 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-64qzx_e2a9bc57-0e8b-4eb5-a28c-109e164aa8d3/controller/0.log" Feb 20 18:13:57 crc kubenswrapper[4697]: I0220 18:13:57.898348 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-frr-files/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.196029 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-reloader/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.205497 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-metrics/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.211706 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-reloader/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.255289 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-frr-files/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.408696 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-frr-files/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.432178 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-metrics/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.439056 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-reloader/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.463341 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-metrics/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.617483 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-reloader/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.617495 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-metrics/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.642012 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/cp-frr-files/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.656115 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/controller/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.806357 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/kube-rbac-proxy/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.875766 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/frr-metrics/0.log" Feb 20 18:13:58 crc kubenswrapper[4697]: I0220 18:13:58.941364 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/kube-rbac-proxy-frr/0.log" Feb 20 18:13:59 crc kubenswrapper[4697]: I0220 18:13:59.065580 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/reloader/0.log" Feb 20 18:13:59 crc kubenswrapper[4697]: I0220 18:13:59.120746 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-lk6qx_08c8d96c-3974-4e6f-ae8e-7283a628643e/frr-k8s-webhook-server/0.log" Feb 20 18:13:59 crc kubenswrapper[4697]: I0220 18:13:59.345080 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6574bdbb48-rfq5g_538df488-7224-4d8b-a08c-463865282008/manager/0.log" Feb 20 18:13:59 crc kubenswrapper[4697]: I0220 18:13:59.531850 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-77767df4c8-b6xsv_50d070c4-1559-40ac-9375-96ddffeb6b1a/webhook-server/0.log" Feb 20 18:13:59 crc kubenswrapper[4697]: I0220 18:13:59.617018 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fzjzn_7d6d1b55-ac21-4967-9487-53b7b236b847/kube-rbac-proxy/0.log" Feb 20 18:14:00 crc kubenswrapper[4697]: I0220 18:14:00.257695 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fzjzn_7d6d1b55-ac21-4967-9487-53b7b236b847/speaker/0.log" Feb 20 18:14:00 crc kubenswrapper[4697]: I0220 18:14:00.397850 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rrd4g_531d9ac7-9f25-4e8b-ba58-b9f6e172c7b7/frr/0.log" Feb 20 18:14:01 crc kubenswrapper[4697]: I0220 18:14:01.185018 4697 patch_prober.go:28] 
interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 18:14:01 crc kubenswrapper[4697]: I0220 18:14:01.185388 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 18:14:14 crc kubenswrapper[4697]: I0220 18:14:14.135288 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/util/0.log" Feb 20 18:14:14 crc kubenswrapper[4697]: I0220 18:14:14.350851 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/util/0.log" Feb 20 18:14:14 crc kubenswrapper[4697]: I0220 18:14:14.361554 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/pull/0.log" Feb 20 18:14:14 crc kubenswrapper[4697]: I0220 18:14:14.412010 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/pull/0.log" Feb 20 18:14:14 crc kubenswrapper[4697]: I0220 18:14:14.585874 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/pull/0.log" Feb 20 18:14:14 crc 
kubenswrapper[4697]: I0220 18:14:14.601456 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/util/0.log" Feb 20 18:14:14 crc kubenswrapper[4697]: I0220 18:14:14.625268 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rpbkh_c9412f48-6077-4f90-8d2d-869512ab617d/extract/0.log" Feb 20 18:14:14 crc kubenswrapper[4697]: I0220 18:14:14.736504 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/util/0.log" Feb 20 18:14:14 crc kubenswrapper[4697]: I0220 18:14:14.937261 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/util/0.log" Feb 20 18:14:14 crc kubenswrapper[4697]: I0220 18:14:14.941197 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/pull/0.log" Feb 20 18:14:14 crc kubenswrapper[4697]: I0220 18:14:14.957499 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/pull/0.log" Feb 20 18:14:15 crc kubenswrapper[4697]: I0220 18:14:15.053949 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/util/0.log" Feb 20 18:14:15 crc kubenswrapper[4697]: I0220 18:14:15.101338 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/pull/0.log" Feb 20 18:14:15 crc kubenswrapper[4697]: I0220 18:14:15.143991 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ngx8f_53d4ee21-f1e0-4bea-b34f-6ff260c092cd/extract/0.log" Feb 20 18:14:15 crc kubenswrapper[4697]: I0220 18:14:15.226786 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-utilities/0.log" Feb 20 18:14:15 crc kubenswrapper[4697]: I0220 18:14:15.430326 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-content/0.log" Feb 20 18:14:15 crc kubenswrapper[4697]: I0220 18:14:15.433593 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-content/0.log" Feb 20 18:14:15 crc kubenswrapper[4697]: I0220 18:14:15.463929 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-utilities/0.log" Feb 20 18:14:15 crc kubenswrapper[4697]: I0220 18:14:15.611196 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-utilities/0.log" Feb 20 18:14:15 crc kubenswrapper[4697]: I0220 18:14:15.642283 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/extract-content/0.log" Feb 20 18:14:15 crc kubenswrapper[4697]: I0220 18:14:15.823782 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-utilities/0.log" Feb 20 18:14:16 crc kubenswrapper[4697]: I0220 18:14:16.030049 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-content/0.log" Feb 20 18:14:16 crc kubenswrapper[4697]: I0220 18:14:16.057103 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-utilities/0.log" Feb 20 18:14:16 crc kubenswrapper[4697]: I0220 18:14:16.154421 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-content/0.log" Feb 20 18:14:16 crc kubenswrapper[4697]: I0220 18:14:16.312009 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-content/0.log" Feb 20 18:14:16 crc kubenswrapper[4697]: I0220 18:14:16.322085 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/extract-utilities/0.log" Feb 20 18:14:16 crc kubenswrapper[4697]: I0220 18:14:16.337220 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qbll4_5c701576-ceb0-4bd0-9583-c7025ea0d061/registry-server/0.log" Feb 20 18:14:16 crc kubenswrapper[4697]: I0220 18:14:16.573032 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/util/0.log" Feb 20 18:14:16 crc kubenswrapper[4697]: I0220 18:14:16.755181 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/util/0.log" Feb 20 18:14:16 crc kubenswrapper[4697]: I0220 18:14:16.813873 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/pull/0.log" Feb 20 18:14:16 crc kubenswrapper[4697]: I0220 18:14:16.827173 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/pull/0.log" Feb 20 18:14:17 crc kubenswrapper[4697]: I0220 18:14:17.091933 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/util/0.log" Feb 20 18:14:17 crc kubenswrapper[4697]: I0220 18:14:17.095405 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/pull/0.log" Feb 20 18:14:17 crc kubenswrapper[4697]: I0220 18:14:17.184328 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecanldtl_2547a2de-95cc-4068-9dc8-6ac185ccd3af/extract/0.log" Feb 20 18:14:17 crc kubenswrapper[4697]: I0220 18:14:17.231098 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-czzb9_582272c0-0a61-44ac-886c-de82d766c32b/registry-server/0.log" Feb 20 18:14:17 crc kubenswrapper[4697]: I0220 18:14:17.340257 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-shppd_84dd2186-876f-440f-8187-51f7bdda1bb8/marketplace-operator/0.log" Feb 20 18:14:17 crc kubenswrapper[4697]: 
I0220 18:14:17.453374 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-utilities/0.log" Feb 20 18:14:17 crc kubenswrapper[4697]: I0220 18:14:17.617531 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-utilities/0.log" Feb 20 18:14:17 crc kubenswrapper[4697]: I0220 18:14:17.675625 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-content/0.log" Feb 20 18:14:17 crc kubenswrapper[4697]: I0220 18:14:17.706369 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-content/0.log" Feb 20 18:14:17 crc kubenswrapper[4697]: I0220 18:14:17.909775 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-utilities/0.log" Feb 20 18:14:17 crc kubenswrapper[4697]: I0220 18:14:17.914230 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/extract-content/0.log" Feb 20 18:14:18 crc kubenswrapper[4697]: I0220 18:14:18.099632 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pr6bg_55542521-9cd2-46b2-ab39-9545c3e50fea/registry-server/0.log" Feb 20 18:14:18 crc kubenswrapper[4697]: I0220 18:14:18.147920 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-utilities/0.log" Feb 20 18:14:18 crc kubenswrapper[4697]: I0220 18:14:18.293702 4697 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-content/0.log" Feb 20 18:14:18 crc kubenswrapper[4697]: I0220 18:14:18.329591 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-utilities/0.log" Feb 20 18:14:18 crc kubenswrapper[4697]: I0220 18:14:18.370197 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-content/0.log" Feb 20 18:14:18 crc kubenswrapper[4697]: I0220 18:14:18.528644 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-utilities/0.log" Feb 20 18:14:18 crc kubenswrapper[4697]: I0220 18:14:18.540398 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/extract-content/0.log" Feb 20 18:14:19 crc kubenswrapper[4697]: I0220 18:14:19.198790 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ksnpf_df2bff45-36b1-4240-9828-29382414ea11/registry-server/0.log" Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.184606 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.185155 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.185429 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.186187 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2207295b9ca3dd56c27a6f62d8877531cf202677384869ec78ba90d1c35300e6"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.186245 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://2207295b9ca3dd56c27a6f62d8877531cf202677384869ec78ba90d1c35300e6" gracePeriod=600 Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.600917 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8679bd8497-rcmm5_565bbe20-d8eb-4878-a048-4d78d8123f6d/prometheus-operator-admission-webhook/0.log" Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.629531 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-cqddv_214e6680-a3c8-4286-9c91-d68893ba73be/prometheus-operator/0.log" Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.645516 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8679bd8497-phkqw_68c668ba-cbcc-4330-95e2-012c78108925/prometheus-operator-admission-webhook/0.log" Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.786995 4697 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8wmf5_12db7fd9-ce39-43cf-99b7-3a56791c0390/operator/0.log" Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.820837 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-j5gtw_94f7dd38-b255-46cd-8b05-4c720857dd86/perses-operator/0.log" Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.941761 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="2207295b9ca3dd56c27a6f62d8877531cf202677384869ec78ba90d1c35300e6" exitCode=0 Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.941807 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"2207295b9ca3dd56c27a6f62d8877531cf202677384869ec78ba90d1c35300e6"} Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.942258 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerStarted","Data":"c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f"} Feb 20 18:14:31 crc kubenswrapper[4697]: I0220 18:14:31.942284 4697 scope.go:117] "RemoveContainer" containerID="e73cb7415468391a0a60e59cd9d4187f1b15cd8f39ccfb41417d83e7aa6d85a6" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.174056 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6"] Feb 20 18:15:00 crc kubenswrapper[4697]: E0220 18:15:00.175377 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerName="registry-server" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.175405 4697 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerName="registry-server" Feb 20 18:15:00 crc kubenswrapper[4697]: E0220 18:15:00.175461 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerName="extract-utilities" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.175473 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerName="extract-utilities" Feb 20 18:15:00 crc kubenswrapper[4697]: E0220 18:15:00.175502 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerName="extract-content" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.175515 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerName="extract-content" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.175819 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de40c05-aabd-43fb-b655-f19b5e81937d" containerName="registry-server" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.176905 4697 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.180610 4697 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.181241 4697 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.204647 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6"] Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.318509 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp5sm\" (UniqueName: \"kubernetes.io/projected/f5c47dcd-34ce-4c29-90a0-8700758fae42-kube-api-access-zp5sm\") pod \"collect-profiles-29526855-lbjq6\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.318686 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5c47dcd-34ce-4c29-90a0-8700758fae42-secret-volume\") pod \"collect-profiles-29526855-lbjq6\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.318718 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5c47dcd-34ce-4c29-90a0-8700758fae42-config-volume\") pod \"collect-profiles-29526855-lbjq6\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.420806 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp5sm\" (UniqueName: \"kubernetes.io/projected/f5c47dcd-34ce-4c29-90a0-8700758fae42-kube-api-access-zp5sm\") pod \"collect-profiles-29526855-lbjq6\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.420943 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5c47dcd-34ce-4c29-90a0-8700758fae42-secret-volume\") pod \"collect-profiles-29526855-lbjq6\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.420985 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5c47dcd-34ce-4c29-90a0-8700758fae42-config-volume\") pod \"collect-profiles-29526855-lbjq6\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.422078 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5c47dcd-34ce-4c29-90a0-8700758fae42-config-volume\") pod \"collect-profiles-29526855-lbjq6\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.439374 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f5c47dcd-34ce-4c29-90a0-8700758fae42-secret-volume\") pod \"collect-profiles-29526855-lbjq6\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.439590 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp5sm\" (UniqueName: \"kubernetes.io/projected/f5c47dcd-34ce-4c29-90a0-8700758fae42-kube-api-access-zp5sm\") pod \"collect-profiles-29526855-lbjq6\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:00 crc kubenswrapper[4697]: I0220 18:15:00.533795 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:01 crc kubenswrapper[4697]: I0220 18:15:01.040345 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6"] Feb 20 18:15:01 crc kubenswrapper[4697]: I0220 18:15:01.238576 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" event={"ID":"f5c47dcd-34ce-4c29-90a0-8700758fae42","Type":"ContainerStarted","Data":"e3ace5331816de2bd42f131272a61a2b9d19e890c7672d081a53e13fc850c83f"} Feb 20 18:15:02 crc kubenswrapper[4697]: I0220 18:15:02.255517 4697 generic.go:334] "Generic (PLEG): container finished" podID="f5c47dcd-34ce-4c29-90a0-8700758fae42" containerID="fd6641946867df01120323d2b7ee8a9a11b69ef2eaf629fff9debaf88026ee9a" exitCode=0 Feb 20 18:15:02 crc kubenswrapper[4697]: I0220 18:15:02.255743 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" 
event={"ID":"f5c47dcd-34ce-4c29-90a0-8700758fae42","Type":"ContainerDied","Data":"fd6641946867df01120323d2b7ee8a9a11b69ef2eaf629fff9debaf88026ee9a"} Feb 20 18:15:03 crc kubenswrapper[4697]: I0220 18:15:03.659698 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:03 crc kubenswrapper[4697]: I0220 18:15:03.804061 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5c47dcd-34ce-4c29-90a0-8700758fae42-secret-volume\") pod \"f5c47dcd-34ce-4c29-90a0-8700758fae42\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " Feb 20 18:15:03 crc kubenswrapper[4697]: I0220 18:15:03.804312 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp5sm\" (UniqueName: \"kubernetes.io/projected/f5c47dcd-34ce-4c29-90a0-8700758fae42-kube-api-access-zp5sm\") pod \"f5c47dcd-34ce-4c29-90a0-8700758fae42\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " Feb 20 18:15:03 crc kubenswrapper[4697]: I0220 18:15:03.804346 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5c47dcd-34ce-4c29-90a0-8700758fae42-config-volume\") pod \"f5c47dcd-34ce-4c29-90a0-8700758fae42\" (UID: \"f5c47dcd-34ce-4c29-90a0-8700758fae42\") " Feb 20 18:15:03 crc kubenswrapper[4697]: I0220 18:15:03.805519 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c47dcd-34ce-4c29-90a0-8700758fae42-config-volume" (OuterVolumeSpecName: "config-volume") pod "f5c47dcd-34ce-4c29-90a0-8700758fae42" (UID: "f5c47dcd-34ce-4c29-90a0-8700758fae42"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 18:15:03 crc kubenswrapper[4697]: I0220 18:15:03.811917 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c47dcd-34ce-4c29-90a0-8700758fae42-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f5c47dcd-34ce-4c29-90a0-8700758fae42" (UID: "f5c47dcd-34ce-4c29-90a0-8700758fae42"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 18:15:03 crc kubenswrapper[4697]: I0220 18:15:03.812292 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c47dcd-34ce-4c29-90a0-8700758fae42-kube-api-access-zp5sm" (OuterVolumeSpecName: "kube-api-access-zp5sm") pod "f5c47dcd-34ce-4c29-90a0-8700758fae42" (UID: "f5c47dcd-34ce-4c29-90a0-8700758fae42"). InnerVolumeSpecName "kube-api-access-zp5sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:15:03 crc kubenswrapper[4697]: I0220 18:15:03.906843 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp5sm\" (UniqueName: \"kubernetes.io/projected/f5c47dcd-34ce-4c29-90a0-8700758fae42-kube-api-access-zp5sm\") on node \"crc\" DevicePath \"\"" Feb 20 18:15:03 crc kubenswrapper[4697]: I0220 18:15:03.906883 4697 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5c47dcd-34ce-4c29-90a0-8700758fae42-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 18:15:03 crc kubenswrapper[4697]: I0220 18:15:03.906892 4697 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5c47dcd-34ce-4c29-90a0-8700758fae42-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 18:15:04 crc kubenswrapper[4697]: I0220 18:15:04.281256 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" 
event={"ID":"f5c47dcd-34ce-4c29-90a0-8700758fae42","Type":"ContainerDied","Data":"e3ace5331816de2bd42f131272a61a2b9d19e890c7672d081a53e13fc850c83f"} Feb 20 18:15:04 crc kubenswrapper[4697]: I0220 18:15:04.281621 4697 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3ace5331816de2bd42f131272a61a2b9d19e890c7672d081a53e13fc850c83f" Feb 20 18:15:04 crc kubenswrapper[4697]: I0220 18:15:04.281351 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526855-lbjq6" Feb 20 18:15:04 crc kubenswrapper[4697]: I0220 18:15:04.742265 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg"] Feb 20 18:15:04 crc kubenswrapper[4697]: I0220 18:15:04.751324 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526810-4tlgg"] Feb 20 18:15:04 crc kubenswrapper[4697]: I0220 18:15:04.914703 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d241f3-bb64-4792-9b76-be72fefb2c4a" path="/var/lib/kubelet/pods/31d241f3-bb64-4792-9b76-be72fefb2c4a/volumes" Feb 20 18:15:48 crc kubenswrapper[4697]: I0220 18:15:48.316649 4697 scope.go:117] "RemoveContainer" containerID="3ce9fb2c1c631ff55e3ac3151f23ac5d871102a94c4ea94e42152e4ad4a4c8bc" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.022197 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q2jkm"] Feb 20 18:15:52 crc kubenswrapper[4697]: E0220 18:15:52.024899 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c47dcd-34ce-4c29-90a0-8700758fae42" containerName="collect-profiles" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.024924 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c47dcd-34ce-4c29-90a0-8700758fae42" containerName="collect-profiles" Feb 20 18:15:52 crc 
kubenswrapper[4697]: I0220 18:15:52.025245 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c47dcd-34ce-4c29-90a0-8700758fae42" containerName="collect-profiles" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.027698 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.039417 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q2jkm"] Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.063616 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-utilities\") pod \"certified-operators-q2jkm\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.063849 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9bmz\" (UniqueName: \"kubernetes.io/projected/1e17598b-7e1d-4153-a2e4-f98d97d25b81-kube-api-access-k9bmz\") pod \"certified-operators-q2jkm\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.063915 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-catalog-content\") pod \"certified-operators-q2jkm\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.166119 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-utilities\") pod \"certified-operators-q2jkm\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.166317 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9bmz\" (UniqueName: \"kubernetes.io/projected/1e17598b-7e1d-4153-a2e4-f98d97d25b81-kube-api-access-k9bmz\") pod \"certified-operators-q2jkm\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.166371 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-catalog-content\") pod \"certified-operators-q2jkm\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.166989 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-utilities\") pod \"certified-operators-q2jkm\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.167507 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-catalog-content\") pod \"certified-operators-q2jkm\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.198500 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9bmz\" (UniqueName: 
\"kubernetes.io/projected/1e17598b-7e1d-4153-a2e4-f98d97d25b81-kube-api-access-k9bmz\") pod \"certified-operators-q2jkm\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.392057 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:15:52 crc kubenswrapper[4697]: I0220 18:15:52.965190 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q2jkm"] Feb 20 18:15:53 crc kubenswrapper[4697]: I0220 18:15:53.830266 4697 generic.go:334] "Generic (PLEG): container finished" podID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" containerID="f1373f17ade779c826c1ef9e8d6941d70267ed24b46f11f35a0d198ac1f24a6b" exitCode=0 Feb 20 18:15:53 crc kubenswrapper[4697]: I0220 18:15:53.830587 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2jkm" event={"ID":"1e17598b-7e1d-4153-a2e4-f98d97d25b81","Type":"ContainerDied","Data":"f1373f17ade779c826c1ef9e8d6941d70267ed24b46f11f35a0d198ac1f24a6b"} Feb 20 18:15:53 crc kubenswrapper[4697]: I0220 18:15:53.830612 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2jkm" event={"ID":"1e17598b-7e1d-4153-a2e4-f98d97d25b81","Type":"ContainerStarted","Data":"4b7730fecc2463cdcf389508ade6c0dd488b04fa73a2feaffb5a4a026d2bcb75"} Feb 20 18:15:55 crc kubenswrapper[4697]: I0220 18:15:55.862565 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2jkm" event={"ID":"1e17598b-7e1d-4153-a2e4-f98d97d25b81","Type":"ContainerStarted","Data":"c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c"} Feb 20 18:15:56 crc kubenswrapper[4697]: I0220 18:15:56.873192 4697 generic.go:334] "Generic (PLEG): container finished" podID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" 
containerID="c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c" exitCode=0 Feb 20 18:15:56 crc kubenswrapper[4697]: I0220 18:15:56.873258 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2jkm" event={"ID":"1e17598b-7e1d-4153-a2e4-f98d97d25b81","Type":"ContainerDied","Data":"c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c"} Feb 20 18:15:57 crc kubenswrapper[4697]: I0220 18:15:57.884539 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2jkm" event={"ID":"1e17598b-7e1d-4153-a2e4-f98d97d25b81","Type":"ContainerStarted","Data":"f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c"} Feb 20 18:15:57 crc kubenswrapper[4697]: I0220 18:15:57.937278 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q2jkm" podStartSLOduration=3.24313423 podStartE2EDuration="6.93723846s" podCreationTimestamp="2026-02-20 18:15:51 +0000 UTC" firstStartedPulling="2026-02-20 18:15:53.832291192 +0000 UTC m=+6261.612336600" lastFinishedPulling="2026-02-20 18:15:57.526395422 +0000 UTC m=+6265.306440830" observedRunningTime="2026-02-20 18:15:57.928912722 +0000 UTC m=+6265.708958140" watchObservedRunningTime="2026-02-20 18:15:57.93723846 +0000 UTC m=+6265.717283908" Feb 20 18:16:02 crc kubenswrapper[4697]: I0220 18:16:02.392853 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:16:02 crc kubenswrapper[4697]: I0220 18:16:02.393462 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:16:02 crc kubenswrapper[4697]: I0220 18:16:02.460123 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:16:03 crc kubenswrapper[4697]: I0220 18:16:03.017473 
4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:16:03 crc kubenswrapper[4697]: I0220 18:16:03.076973 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q2jkm"] Feb 20 18:16:04 crc kubenswrapper[4697]: I0220 18:16:04.961040 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q2jkm" podUID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" containerName="registry-server" containerID="cri-o://f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c" gracePeriod=2 Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.472571 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.565261 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-utilities\") pod \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.565340 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-catalog-content\") pod \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.565456 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9bmz\" (UniqueName: \"kubernetes.io/projected/1e17598b-7e1d-4153-a2e4-f98d97d25b81-kube-api-access-k9bmz\") pod \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\" (UID: \"1e17598b-7e1d-4153-a2e4-f98d97d25b81\") " Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 
18:16:05.567228 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-utilities" (OuterVolumeSpecName: "utilities") pod "1e17598b-7e1d-4153-a2e4-f98d97d25b81" (UID: "1e17598b-7e1d-4153-a2e4-f98d97d25b81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.598822 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e17598b-7e1d-4153-a2e4-f98d97d25b81-kube-api-access-k9bmz" (OuterVolumeSpecName: "kube-api-access-k9bmz") pod "1e17598b-7e1d-4153-a2e4-f98d97d25b81" (UID: "1e17598b-7e1d-4153-a2e4-f98d97d25b81"). InnerVolumeSpecName "kube-api-access-k9bmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.667633 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.667673 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9bmz\" (UniqueName: \"kubernetes.io/projected/1e17598b-7e1d-4153-a2e4-f98d97d25b81-kube-api-access-k9bmz\") on node \"crc\" DevicePath \"\"" Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.855031 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e17598b-7e1d-4153-a2e4-f98d97d25b81" (UID: "1e17598b-7e1d-4153-a2e4-f98d97d25b81"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.871992 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e17598b-7e1d-4153-a2e4-f98d97d25b81-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.974835 4697 generic.go:334] "Generic (PLEG): container finished" podID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" containerID="f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c" exitCode=0 Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.974943 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2jkm" event={"ID":"1e17598b-7e1d-4153-a2e4-f98d97d25b81","Type":"ContainerDied","Data":"f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c"} Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.974972 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2jkm" event={"ID":"1e17598b-7e1d-4153-a2e4-f98d97d25b81","Type":"ContainerDied","Data":"4b7730fecc2463cdcf389508ade6c0dd488b04fa73a2feaffb5a4a026d2bcb75"} Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.974991 4697 scope.go:117] "RemoveContainer" containerID="f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c" Feb 20 18:16:05 crc kubenswrapper[4697]: I0220 18:16:05.975223 4697 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q2jkm" Feb 20 18:16:06 crc kubenswrapper[4697]: I0220 18:16:06.005921 4697 scope.go:117] "RemoveContainer" containerID="c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c" Feb 20 18:16:06 crc kubenswrapper[4697]: I0220 18:16:06.016697 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q2jkm"] Feb 20 18:16:06 crc kubenswrapper[4697]: I0220 18:16:06.026461 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q2jkm"] Feb 20 18:16:06 crc kubenswrapper[4697]: I0220 18:16:06.041115 4697 scope.go:117] "RemoveContainer" containerID="f1373f17ade779c826c1ef9e8d6941d70267ed24b46f11f35a0d198ac1f24a6b" Feb 20 18:16:06 crc kubenswrapper[4697]: I0220 18:16:06.096998 4697 scope.go:117] "RemoveContainer" containerID="f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c" Feb 20 18:16:06 crc kubenswrapper[4697]: E0220 18:16:06.097840 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c\": container with ID starting with f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c not found: ID does not exist" containerID="f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c" Feb 20 18:16:06 crc kubenswrapper[4697]: I0220 18:16:06.097898 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c"} err="failed to get container status \"f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c\": rpc error: code = NotFound desc = could not find container \"f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c\": container with ID starting with f6c07a73414512b5b94d024e5ac1f8ad45ff246b29a41f185ad30692cc02630c not 
found: ID does not exist" Feb 20 18:16:06 crc kubenswrapper[4697]: I0220 18:16:06.097929 4697 scope.go:117] "RemoveContainer" containerID="c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c" Feb 20 18:16:06 crc kubenswrapper[4697]: E0220 18:16:06.098419 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c\": container with ID starting with c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c not found: ID does not exist" containerID="c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c" Feb 20 18:16:06 crc kubenswrapper[4697]: I0220 18:16:06.098468 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c"} err="failed to get container status \"c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c\": rpc error: code = NotFound desc = could not find container \"c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c\": container with ID starting with c013f80e0219746b800334aa87e7fd57adce04b096bcd33a59d2a29e4372267c not found: ID does not exist" Feb 20 18:16:06 crc kubenswrapper[4697]: I0220 18:16:06.098496 4697 scope.go:117] "RemoveContainer" containerID="f1373f17ade779c826c1ef9e8d6941d70267ed24b46f11f35a0d198ac1f24a6b" Feb 20 18:16:06 crc kubenswrapper[4697]: E0220 18:16:06.098890 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1373f17ade779c826c1ef9e8d6941d70267ed24b46f11f35a0d198ac1f24a6b\": container with ID starting with f1373f17ade779c826c1ef9e8d6941d70267ed24b46f11f35a0d198ac1f24a6b not found: ID does not exist" containerID="f1373f17ade779c826c1ef9e8d6941d70267ed24b46f11f35a0d198ac1f24a6b" Feb 20 18:16:06 crc kubenswrapper[4697]: I0220 18:16:06.098916 4697 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1373f17ade779c826c1ef9e8d6941d70267ed24b46f11f35a0d198ac1f24a6b"} err="failed to get container status \"f1373f17ade779c826c1ef9e8d6941d70267ed24b46f11f35a0d198ac1f24a6b\": rpc error: code = NotFound desc = could not find container \"f1373f17ade779c826c1ef9e8d6941d70267ed24b46f11f35a0d198ac1f24a6b\": container with ID starting with f1373f17ade779c826c1ef9e8d6941d70267ed24b46f11f35a0d198ac1f24a6b not found: ID does not exist" Feb 20 18:16:06 crc kubenswrapper[4697]: I0220 18:16:06.893120 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" path="/var/lib/kubelet/pods/1e17598b-7e1d-4153-a2e4-f98d97d25b81/volumes" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.208682 4697 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-54n2r"] Feb 20 18:16:25 crc kubenswrapper[4697]: E0220 18:16:25.210097 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" containerName="registry-server" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.210117 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" containerName="registry-server" Feb 20 18:16:25 crc kubenswrapper[4697]: E0220 18:16:25.210151 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" containerName="extract-content" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.210162 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" containerName="extract-content" Feb 20 18:16:25 crc kubenswrapper[4697]: E0220 18:16:25.210176 4697 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" containerName="extract-utilities" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 
18:16:25.210188 4697 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" containerName="extract-utilities" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.210504 4697 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e17598b-7e1d-4153-a2e4-f98d97d25b81" containerName="registry-server" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.212710 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.253489 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-54n2r"] Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.329810 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlc5w\" (UniqueName: \"kubernetes.io/projected/4cdd46e0-f93a-4157-a5b6-194c3aee0664-kube-api-access-dlc5w\") pod \"redhat-operators-54n2r\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.330598 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-catalog-content\") pod \"redhat-operators-54n2r\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.330631 4697 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-utilities\") pod \"redhat-operators-54n2r\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 
18:16:25.433557 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlc5w\" (UniqueName: \"kubernetes.io/projected/4cdd46e0-f93a-4157-a5b6-194c3aee0664-kube-api-access-dlc5w\") pod \"redhat-operators-54n2r\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.433972 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-utilities\") pod \"redhat-operators-54n2r\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.433998 4697 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-catalog-content\") pod \"redhat-operators-54n2r\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.434520 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-catalog-content\") pod \"redhat-operators-54n2r\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.434812 4697 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-utilities\") pod \"redhat-operators-54n2r\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.453795 4697 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dlc5w\" (UniqueName: \"kubernetes.io/projected/4cdd46e0-f93a-4157-a5b6-194c3aee0664-kube-api-access-dlc5w\") pod \"redhat-operators-54n2r\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:25 crc kubenswrapper[4697]: I0220 18:16:25.540841 4697 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:26 crc kubenswrapper[4697]: I0220 18:16:26.023625 4697 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-54n2r"] Feb 20 18:16:26 crc kubenswrapper[4697]: W0220 18:16:26.034987 4697 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cdd46e0_f93a_4157_a5b6_194c3aee0664.slice/crio-9d153f43eed25e404a53ad8508a21c173418f2569bb782116cc90c6b45fef9fe WatchSource:0}: Error finding container 9d153f43eed25e404a53ad8508a21c173418f2569bb782116cc90c6b45fef9fe: Status 404 returned error can't find the container with id 9d153f43eed25e404a53ad8508a21c173418f2569bb782116cc90c6b45fef9fe Feb 20 18:16:26 crc kubenswrapper[4697]: I0220 18:16:26.217307 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54n2r" event={"ID":"4cdd46e0-f93a-4157-a5b6-194c3aee0664","Type":"ContainerStarted","Data":"9d153f43eed25e404a53ad8508a21c173418f2569bb782116cc90c6b45fef9fe"} Feb 20 18:16:27 crc kubenswrapper[4697]: I0220 18:16:27.230688 4697 generic.go:334] "Generic (PLEG): container finished" podID="4cdd46e0-f93a-4157-a5b6-194c3aee0664" containerID="300736777ca28b4825ff116ae6243472368e6ae406b20ddacc6f76bb1abae213" exitCode=0 Feb 20 18:16:27 crc kubenswrapper[4697]: I0220 18:16:27.231026 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54n2r" 
event={"ID":"4cdd46e0-f93a-4157-a5b6-194c3aee0664","Type":"ContainerDied","Data":"300736777ca28b4825ff116ae6243472368e6ae406b20ddacc6f76bb1abae213"} Feb 20 18:16:29 crc kubenswrapper[4697]: I0220 18:16:29.258863 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54n2r" event={"ID":"4cdd46e0-f93a-4157-a5b6-194c3aee0664","Type":"ContainerStarted","Data":"9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685"} Feb 20 18:16:31 crc kubenswrapper[4697]: I0220 18:16:31.184710 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 18:16:31 crc kubenswrapper[4697]: I0220 18:16:31.185172 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 18:16:33 crc kubenswrapper[4697]: I0220 18:16:33.332624 4697 generic.go:334] "Generic (PLEG): container finished" podID="4cdd46e0-f93a-4157-a5b6-194c3aee0664" containerID="9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685" exitCode=0 Feb 20 18:16:33 crc kubenswrapper[4697]: I0220 18:16:33.332734 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54n2r" event={"ID":"4cdd46e0-f93a-4157-a5b6-194c3aee0664","Type":"ContainerDied","Data":"9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685"} Feb 20 18:16:34 crc kubenswrapper[4697]: I0220 18:16:34.346051 4697 generic.go:334] "Generic (PLEG): container finished" podID="d852f86a-ca7e-49f4-b223-af8574601e18" 
containerID="3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20" exitCode=0 Feb 20 18:16:34 crc kubenswrapper[4697]: I0220 18:16:34.346163 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nbm5z/must-gather-xnv6s" event={"ID":"d852f86a-ca7e-49f4-b223-af8574601e18","Type":"ContainerDied","Data":"3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20"} Feb 20 18:16:34 crc kubenswrapper[4697]: I0220 18:16:34.347130 4697 scope.go:117] "RemoveContainer" containerID="3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20" Feb 20 18:16:34 crc kubenswrapper[4697]: I0220 18:16:34.348970 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54n2r" event={"ID":"4cdd46e0-f93a-4157-a5b6-194c3aee0664","Type":"ContainerStarted","Data":"2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e"} Feb 20 18:16:34 crc kubenswrapper[4697]: I0220 18:16:34.972173 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nbm5z_must-gather-xnv6s_d852f86a-ca7e-49f4-b223-af8574601e18/gather/0.log" Feb 20 18:16:35 crc kubenswrapper[4697]: I0220 18:16:35.541160 4697 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:35 crc kubenswrapper[4697]: I0220 18:16:35.541208 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:36 crc kubenswrapper[4697]: I0220 18:16:36.621945 4697 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-54n2r" podUID="4cdd46e0-f93a-4157-a5b6-194c3aee0664" containerName="registry-server" probeResult="failure" output=< Feb 20 18:16:36 crc kubenswrapper[4697]: timeout: failed to connect service ":50051" within 1s Feb 20 18:16:36 crc kubenswrapper[4697]: > Feb 20 18:16:45 crc kubenswrapper[4697]: I0220 18:16:45.594096 4697 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:45 crc kubenswrapper[4697]: I0220 18:16:45.623730 4697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-54n2r" podStartSLOduration=14.04288979 podStartE2EDuration="20.62371404s" podCreationTimestamp="2026-02-20 18:16:25 +0000 UTC" firstStartedPulling="2026-02-20 18:16:27.233177982 +0000 UTC m=+6295.013223430" lastFinishedPulling="2026-02-20 18:16:33.814002272 +0000 UTC m=+6301.594047680" observedRunningTime="2026-02-20 18:16:34.384662869 +0000 UTC m=+6302.164708287" watchObservedRunningTime="2026-02-20 18:16:45.62371404 +0000 UTC m=+6313.403759448" Feb 20 18:16:45 crc kubenswrapper[4697]: I0220 18:16:45.648232 4697 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:45 crc kubenswrapper[4697]: I0220 18:16:45.839605 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-54n2r"] Feb 20 18:16:47 crc kubenswrapper[4697]: I0220 18:16:47.509054 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-54n2r" podUID="4cdd46e0-f93a-4157-a5b6-194c3aee0664" containerName="registry-server" containerID="cri-o://2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e" gracePeriod=2 Feb 20 18:16:47 crc kubenswrapper[4697]: I0220 18:16:47.739061 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nbm5z/must-gather-xnv6s"] Feb 20 18:16:47 crc kubenswrapper[4697]: I0220 18:16:47.739359 4697 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-nbm5z/must-gather-xnv6s" podUID="d852f86a-ca7e-49f4-b223-af8574601e18" containerName="copy" containerID="cri-o://3bac8c082aa836d42d710220d2eea3c4c71d1485df8d1a65d7aa4cdea6970f95" 
gracePeriod=2 Feb 20 18:16:47 crc kubenswrapper[4697]: I0220 18:16:47.760456 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nbm5z/must-gather-xnv6s"] Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.107198 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.190618 4697 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nbm5z_must-gather-xnv6s_d852f86a-ca7e-49f4-b223-af8574601e18/copy/0.log" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.191682 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nbm5z/must-gather-xnv6s" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.220695 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-utilities\") pod \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.221423 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlc5w\" (UniqueName: \"kubernetes.io/projected/4cdd46e0-f93a-4157-a5b6-194c3aee0664-kube-api-access-dlc5w\") pod \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.221662 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-catalog-content\") pod \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\" (UID: \"4cdd46e0-f93a-4157-a5b6-194c3aee0664\") " Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.222125 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-utilities" (OuterVolumeSpecName: "utilities") pod "4cdd46e0-f93a-4157-a5b6-194c3aee0664" (UID: "4cdd46e0-f93a-4157-a5b6-194c3aee0664"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.222924 4697 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.228300 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cdd46e0-f93a-4157-a5b6-194c3aee0664-kube-api-access-dlc5w" (OuterVolumeSpecName: "kube-api-access-dlc5w") pod "4cdd46e0-f93a-4157-a5b6-194c3aee0664" (UID: "4cdd46e0-f93a-4157-a5b6-194c3aee0664"). InnerVolumeSpecName "kube-api-access-dlc5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.324031 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkgg6\" (UniqueName: \"kubernetes.io/projected/d852f86a-ca7e-49f4-b223-af8574601e18-kube-api-access-lkgg6\") pod \"d852f86a-ca7e-49f4-b223-af8574601e18\" (UID: \"d852f86a-ca7e-49f4-b223-af8574601e18\") " Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.324126 4697 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d852f86a-ca7e-49f4-b223-af8574601e18-must-gather-output\") pod \"d852f86a-ca7e-49f4-b223-af8574601e18\" (UID: \"d852f86a-ca7e-49f4-b223-af8574601e18\") " Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.324862 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlc5w\" (UniqueName: \"kubernetes.io/projected/4cdd46e0-f93a-4157-a5b6-194c3aee0664-kube-api-access-dlc5w\") 
on node \"crc\" DevicePath \"\"" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.329174 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d852f86a-ca7e-49f4-b223-af8574601e18-kube-api-access-lkgg6" (OuterVolumeSpecName: "kube-api-access-lkgg6") pod "d852f86a-ca7e-49f4-b223-af8574601e18" (UID: "d852f86a-ca7e-49f4-b223-af8574601e18"). InnerVolumeSpecName "kube-api-access-lkgg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.358120 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4cdd46e0-f93a-4157-a5b6-194c3aee0664" (UID: "4cdd46e0-f93a-4157-a5b6-194c3aee0664"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.420509 4697 scope.go:117] "RemoveContainer" containerID="3bac8c082aa836d42d710220d2eea3c4c71d1485df8d1a65d7aa4cdea6970f95" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.427194 4697 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkgg6\" (UniqueName: \"kubernetes.io/projected/d852f86a-ca7e-49f4-b223-af8574601e18-kube-api-access-lkgg6\") on node \"crc\" DevicePath \"\"" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.427227 4697 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cdd46e0-f93a-4157-a5b6-194c3aee0664-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.450792 4697 scope.go:117] "RemoveContainer" containerID="7fd39bd85f983d77c9ba58e0b051a3139e2e2df6c804f739e2dec7da4ad79694" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.505310 4697 scope.go:117] "RemoveContainer" 
containerID="3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.526795 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nbm5z/must-gather-xnv6s" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.526803 4697 scope.go:117] "RemoveContainer" containerID="3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.530142 4697 generic.go:334] "Generic (PLEG): container finished" podID="4cdd46e0-f93a-4157-a5b6-194c3aee0664" containerID="2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e" exitCode=0 Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.530223 4697 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54n2r" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.530226 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54n2r" event={"ID":"4cdd46e0-f93a-4157-a5b6-194c3aee0664","Type":"ContainerDied","Data":"2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e"} Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.530824 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54n2r" event={"ID":"4cdd46e0-f93a-4157-a5b6-194c3aee0664","Type":"ContainerDied","Data":"9d153f43eed25e404a53ad8508a21c173418f2569bb782116cc90c6b45fef9fe"} Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.532738 4697 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d852f86a-ca7e-49f4-b223-af8574601e18-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d852f86a-ca7e-49f4-b223-af8574601e18" (UID: "d852f86a-ca7e-49f4-b223-af8574601e18"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.586809 4697 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-54n2r"] Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.594641 4697 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-54n2r"] Feb 20 18:16:48 crc kubenswrapper[4697]: E0220 18:16:48.627955 4697 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_gather_must-gather-xnv6s_openshift-must-gather-nbm5z_d852f86a-ca7e-49f4-b223-af8574601e18_0 in pod sandbox 87f9747e3706596766261215a057968b723ffe79a85ca1c97e867da5a805b192: identifier is not a container" containerID="3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.628015 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20"} err="rpc error: code = Unknown desc = failed to delete container k8s_gather_must-gather-xnv6s_openshift-must-gather-nbm5z_d852f86a-ca7e-49f4-b223-af8574601e18_0 in pod sandbox 87f9747e3706596766261215a057968b723ffe79a85ca1c97e867da5a805b192: identifier is not a container" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.628043 4697 scope.go:117] "RemoveContainer" containerID="3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20" Feb 20 18:16:48 crc kubenswrapper[4697]: E0220 18:16:48.628403 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20\": container with ID starting with 3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20 not found: ID does not exist" 
containerID="3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.628460 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20"} err="failed to get container status \"3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20\": rpc error: code = NotFound desc = could not find container \"3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20\": container with ID starting with 3deeec006a1b03405e823aa34617cca5ad83bb12f866b36373a38acc2feace20 not found: ID does not exist" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.628476 4697 scope.go:117] "RemoveContainer" containerID="2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.630571 4697 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d852f86a-ca7e-49f4-b223-af8574601e18-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.651708 4697 scope.go:117] "RemoveContainer" containerID="9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.710141 4697 scope.go:117] "RemoveContainer" containerID="300736777ca28b4825ff116ae6243472368e6ae406b20ddacc6f76bb1abae213" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.756102 4697 scope.go:117] "RemoveContainer" containerID="2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e" Feb 20 18:16:48 crc kubenswrapper[4697]: E0220 18:16:48.757044 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e\": container with ID starting with 
2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e not found: ID does not exist" containerID="2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.757095 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e"} err="failed to get container status \"2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e\": rpc error: code = NotFound desc = could not find container \"2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e\": container with ID starting with 2c1ecd96fb61abedbe97dff6c2dd20685bdcf757359728e29d9ab6af6a79437e not found: ID does not exist" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.757128 4697 scope.go:117] "RemoveContainer" containerID="9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685" Feb 20 18:16:48 crc kubenswrapper[4697]: E0220 18:16:48.763005 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685\": container with ID starting with 9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685 not found: ID does not exist" containerID="9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.763058 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685"} err="failed to get container status \"9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685\": rpc error: code = NotFound desc = could not find container \"9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685\": container with ID starting with 9c47a7abc4cc4cdf5b0c1e8b76cdfe4d47560858c7ccb1ad51e4fc532e773685 not found: ID does not 
exist" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.763090 4697 scope.go:117] "RemoveContainer" containerID="300736777ca28b4825ff116ae6243472368e6ae406b20ddacc6f76bb1abae213" Feb 20 18:16:48 crc kubenswrapper[4697]: E0220 18:16:48.763447 4697 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"300736777ca28b4825ff116ae6243472368e6ae406b20ddacc6f76bb1abae213\": container with ID starting with 300736777ca28b4825ff116ae6243472368e6ae406b20ddacc6f76bb1abae213 not found: ID does not exist" containerID="300736777ca28b4825ff116ae6243472368e6ae406b20ddacc6f76bb1abae213" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.763487 4697 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300736777ca28b4825ff116ae6243472368e6ae406b20ddacc6f76bb1abae213"} err="failed to get container status \"300736777ca28b4825ff116ae6243472368e6ae406b20ddacc6f76bb1abae213\": rpc error: code = NotFound desc = could not find container \"300736777ca28b4825ff116ae6243472368e6ae406b20ddacc6f76bb1abae213\": container with ID starting with 300736777ca28b4825ff116ae6243472368e6ae406b20ddacc6f76bb1abae213 not found: ID does not exist" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.886644 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cdd46e0-f93a-4157-a5b6-194c3aee0664" path="/var/lib/kubelet/pods/4cdd46e0-f93a-4157-a5b6-194c3aee0664/volumes" Feb 20 18:16:48 crc kubenswrapper[4697]: I0220 18:16:48.887356 4697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d852f86a-ca7e-49f4-b223-af8574601e18" path="/var/lib/kubelet/pods/d852f86a-ca7e-49f4-b223-af8574601e18/volumes" Feb 20 18:17:01 crc kubenswrapper[4697]: I0220 18:17:01.185282 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 18:17:01 crc kubenswrapper[4697]: I0220 18:17:01.185909 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 18:17:31 crc kubenswrapper[4697]: I0220 18:17:31.185562 4697 patch_prober.go:28] interesting pod/machine-config-daemon-bgvrc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 18:17:31 crc kubenswrapper[4697]: I0220 18:17:31.186323 4697 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 18:17:31 crc kubenswrapper[4697]: I0220 18:17:31.186399 4697 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" Feb 20 18:17:31 crc kubenswrapper[4697]: I0220 18:17:31.187454 4697 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f"} pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 18:17:31 crc kubenswrapper[4697]: I0220 18:17:31.187602 4697 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerName="machine-config-daemon" containerID="cri-o://c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" gracePeriod=600 Feb 20 18:17:31 crc kubenswrapper[4697]: E0220 18:17:31.321866 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:17:31 crc kubenswrapper[4697]: I0220 18:17:31.968968 4697 generic.go:334] "Generic (PLEG): container finished" podID="ba970a98-5bee-40d6-ade6-6dcbed87b581" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" exitCode=0 Feb 20 18:17:31 crc kubenswrapper[4697]: I0220 18:17:31.969046 4697 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" event={"ID":"ba970a98-5bee-40d6-ade6-6dcbed87b581","Type":"ContainerDied","Data":"c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f"} Feb 20 18:17:31 crc kubenswrapper[4697]: I0220 18:17:31.969303 4697 scope.go:117] "RemoveContainer" containerID="2207295b9ca3dd56c27a6f62d8877531cf202677384869ec78ba90d1c35300e6" Feb 20 18:17:31 crc kubenswrapper[4697]: I0220 18:17:31.970020 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:17:31 crc kubenswrapper[4697]: E0220 18:17:31.970297 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:17:44 crc kubenswrapper[4697]: I0220 18:17:44.877757 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:17:44 crc kubenswrapper[4697]: E0220 18:17:44.879064 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:17:48 crc kubenswrapper[4697]: I0220 18:17:48.697190 4697 scope.go:117] "RemoveContainer" containerID="53f53372d0f635c721224c8980f289d54365190679d36095c59571e92d96f2ca" Feb 20 18:17:56 crc kubenswrapper[4697]: I0220 18:17:56.876870 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:17:56 crc kubenswrapper[4697]: E0220 18:17:56.878889 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:18:07 crc kubenswrapper[4697]: I0220 18:18:07.877611 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:18:07 crc kubenswrapper[4697]: E0220 18:18:07.878463 4697 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:18:22 crc kubenswrapper[4697]: I0220 18:18:22.886011 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:18:22 crc kubenswrapper[4697]: E0220 18:18:22.887124 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:18:36 crc kubenswrapper[4697]: I0220 18:18:36.878185 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:18:36 crc kubenswrapper[4697]: E0220 18:18:36.878983 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:18:48 crc kubenswrapper[4697]: I0220 18:18:48.877133 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:18:48 crc kubenswrapper[4697]: E0220 
18:18:48.877872 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:19:03 crc kubenswrapper[4697]: I0220 18:19:03.878255 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:19:03 crc kubenswrapper[4697]: E0220 18:19:03.879967 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:19:15 crc kubenswrapper[4697]: I0220 18:19:15.877509 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:19:15 crc kubenswrapper[4697]: E0220 18:19:15.878877 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:19:27 crc kubenswrapper[4697]: I0220 18:19:27.877677 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:19:27 crc 
kubenswrapper[4697]: E0220 18:19:27.879346 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:19:42 crc kubenswrapper[4697]: I0220 18:19:42.883799 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:19:42 crc kubenswrapper[4697]: E0220 18:19:42.884631 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:19:56 crc kubenswrapper[4697]: I0220 18:19:56.878059 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:19:56 crc kubenswrapper[4697]: E0220 18:19:56.879347 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:20:08 crc kubenswrapper[4697]: I0220 18:20:08.877989 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 
20 18:20:08 crc kubenswrapper[4697]: E0220 18:20:08.879182 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:20:23 crc kubenswrapper[4697]: I0220 18:20:23.877516 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:20:23 crc kubenswrapper[4697]: E0220 18:20:23.879194 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:20:35 crc kubenswrapper[4697]: I0220 18:20:35.877606 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:20:35 crc kubenswrapper[4697]: E0220 18:20:35.881215 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:20:48 crc kubenswrapper[4697]: I0220 18:20:48.878038 4697 scope.go:117] "RemoveContainer" 
containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:20:48 crc kubenswrapper[4697]: E0220 18:20:48.879565 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581" Feb 20 18:21:03 crc kubenswrapper[4697]: I0220 18:21:03.877699 4697 scope.go:117] "RemoveContainer" containerID="c5eea661d9e8606954e89794724044f819d08b44c0e83a6118f0ab5f04493c1f" Feb 20 18:21:03 crc kubenswrapper[4697]: E0220 18:21:03.878726 4697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bgvrc_openshift-machine-config-operator(ba970a98-5bee-40d6-ade6-6dcbed87b581)\"" pod="openshift-machine-config-operator/machine-config-daemon-bgvrc" podUID="ba970a98-5bee-40d6-ade6-6dcbed87b581"